37031 1727204376.84373: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-G1p
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
37031 1727204376.84810: Added group all to inventory
37031 1727204376.84812: Added group ungrouped to inventory
37031 1727204376.84817: Group all now contains ungrouped
37031 1727204376.84820: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml
37031 1727204377.01962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
37031 1727204377.02026: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
37031 1727204377.02050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
37031 1727204377.02113: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
37031 1727204377.02188: Loaded config def from plugin (inventory/script)
37031 1727204377.02190: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
37031 1727204377.02231: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
37031 1727204377.02321: Loaded config def from plugin (inventory/yaml)
37031 1727204377.02323: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
37031 1727204377.02413: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
37031 1727204377.02834: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
37031 1727204377.02838: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
37031 1727204377.02841: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
37031 1727204377.02847: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
37031 1727204377.02851: Loading data from /tmp/network-M6W/inventory-5vW.yml
37031 1727204377.02921: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto
37031 1727204377.02992: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
37031 1727204377.03031: Loading data from /tmp/network-M6W/inventory-5vW.yml
37031 1727204377.03117: group all already in inventory
37031 1727204377.03125: set inventory_file for managed-node1
37031 1727204377.03129: set inventory_dir for managed-node1
37031 1727204377.03130: Added host managed-node1 to inventory
37031 1727204377.03132: Added host managed-node1 to group all
37031 1727204377.03133: set ansible_host for managed-node1
37031 1727204377.03134: set ansible_ssh_extra_args for managed-node1
37031 1727204377.03138: set inventory_file for managed-node2
37031 1727204377.03141: set inventory_dir for managed-node2
37031 1727204377.03142: Added host managed-node2 to inventory
37031 1727204377.03143: Added host managed-node2 to group all
37031 1727204377.03144: set ansible_host for managed-node2
37031 1727204377.03145: set ansible_ssh_extra_args for managed-node2
37031 1727204377.03148: set inventory_file for managed-node3
37031 1727204377.03150: set inventory_dir for managed-node3
37031 1727204377.03151: Added host managed-node3 to inventory
37031 1727204377.03152: Added host managed-node3 to group all
37031 1727204377.03153: set ansible_host for managed-node3
37031 1727204377.03154: set ansible_ssh_extra_args for managed-node3
37031 1727204377.03157: Reconcile groups and hosts in inventory.
37031 1727204377.03161: Group ungrouped now contains managed-node1
37031 1727204377.03163: Group ungrouped now contains managed-node2
37031 1727204377.03166: Group ungrouped now contains managed-node3
37031 1727204377.03242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
37031 1727204377.03367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
37031 1727204377.03415: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
37031 1727204377.03442: Loaded config def from plugin (vars/host_group_vars)
37031 1727204377.03444: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
37031 1727204377.03451: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
37031 1727204377.03458: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
37031 1727204377.03501: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
37031 1727204377.03848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204377.03940: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
37031 1727204377.03981: Loaded config def from plugin (connection/local)
37031 1727204377.03984: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
37031 1727204377.04572: Loaded config def from plugin (connection/paramiko_ssh)
37031 1727204377.04576: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
37031 1727204377.05531: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
37031 1727204377.05575: Loaded config def from plugin (connection/psrp)
37031 1727204377.05579: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
37031 1727204377.06293: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
37031 1727204377.06333: Loaded config def from plugin (connection/ssh)
37031 1727204377.06336: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
37031 1727204377.06726: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
37031 1727204377.06767: Loaded config def from plugin (connection/winrm)
37031 1727204377.06770: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
37031 1727204377.06804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
37031 1727204377.06872: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
37031 1727204377.06937: Loaded config def from plugin (shell/cmd)
37031 1727204377.06940: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
37031 1727204377.06970: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
37031 1727204377.07034: Loaded config def from plugin (shell/powershell)
37031 1727204377.07037: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
37031 1727204377.07092: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
37031 1727204377.07268: Loaded config def from plugin (shell/sh)
37031 1727204377.07271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
37031 1727204377.07305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
37031 1727204377.07427: Loaded config def from plugin (become/runas)
37031 1727204377.07430: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
37031 1727204377.07649: Loaded config def from plugin (become/su)
37031 1727204377.07651: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
37031 1727204377.07813: Loaded config def from plugin (become/sudo)
37031 1727204377.07815: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
37031 1727204377.07850: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
37031 1727204377.08600: in VariableManager get_vars()
37031 1727204377.08622: done with get_vars()
37031 1727204377.08961: trying /usr/local/lib/python3.12/site-packages/ansible/modules
37031 1727204377.16001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
37031 1727204377.16133: in VariableManager get_vars()
37031 1727204377.16138: done with get_vars()
37031 1727204377.16141: variable 'playbook_dir' from source: magic vars
37031 1727204377.16142: variable 'ansible_playbook_python' from source: magic vars
37031 1727204377.16142: variable 'ansible_config_file' from source: magic vars
37031 1727204377.16143: variable 'groups' from source: magic vars
37031 1727204377.16144: variable 'omit' from source: magic vars
37031 1727204377.16145: variable 'ansible_version' from source: magic vars
37031 1727204377.16146: variable 'ansible_check_mode' from source: magic vars
37031 1727204377.16147: variable 'ansible_diff_mode' from source: magic vars
37031 1727204377.16147: variable 'ansible_forks' from source: magic vars
37031 1727204377.16148: variable 'ansible_inventory_sources' from source: magic vars
37031 1727204377.16149: variable 'ansible_skip_tags' from source: magic vars
37031 1727204377.16150: variable 'ansible_limit' from source: magic vars
37031 1727204377.16150: variable 'ansible_run_tags' from source: magic vars
37031 1727204377.16151: variable 'ansible_verbosity' from source: magic vars
37031 1727204377.16190: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml
37031 1727204377.17765: in VariableManager get_vars()
37031 1727204377.18692: done with get_vars()
37031 1727204377.18739: in VariableManager get_vars()
37031 1727204377.18757: done with get_vars()
37031 1727204377.19055: in VariableManager get_vars()
37031 1727204377.19070: done with get_vars()
37031 1727204377.19075: variable 'omit' from source: magic vars
37031 1727204377.19092: variable 'omit' from source: magic vars
37031 1727204377.19125: in VariableManager get_vars()
37031 1727204377.19134: done with get_vars()
37031 1727204377.19731: in VariableManager get_vars()
37031 1727204377.19745: done with get_vars()
37031 1727204377.19787: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
37031 1727204377.20647: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
37031 1727204377.20905: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
37031 1727204377.22749: in VariableManager get_vars()
37031 1727204377.22777: done with get_vars()
37031 1727204377.23817: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
37031 1727204377.24106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
37031 1727204377.29409: in VariableManager get_vars()
37031 1727204377.29430: done with get_vars()
37031 1727204377.30478: in VariableManager get_vars()
37031 1727204377.30518: done with get_vars()
37031 1727204377.32276: in VariableManager get_vars()
37031 1727204377.32297: done with get_vars()
37031 1727204377.32302: variable 'omit' from source: magic vars
37031 1727204377.32314: variable 'omit' from source: magic vars
37031 1727204377.32348: in VariableManager get_vars()
37031 1727204377.33371: done with get_vars()
37031 1727204377.33395: in VariableManager get_vars()
37031 1727204377.33412: done with get_vars()
37031 1727204377.33443: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
37031 1727204377.33569: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
37031 1727204377.33667: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
37031 1727204377.34584: in VariableManager get_vars()
37031 1727204377.34607: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
37031 1727204377.38823: in VariableManager get_vars()
37031 1727204377.38849: done with get_vars()
37031 1727204377.39174: in VariableManager get_vars()
37031 1727204377.39212: done with get_vars()
37031 1727204377.40400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
37031 1727204377.40415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
37031 1727204377.41366: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
37031 1727204377.41534: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
37031 1727204377.41537: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
37031 1727204377.41576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
37031 1727204377.41603: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
37031 1727204377.41778: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
37031 1727204377.41839: Loaded config def from plugin (callback/default)
37031 1727204377.41842: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
37031 1727204377.44308: Loaded config def from plugin (callback/junit)
37031 1727204377.44311: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
37031 1727204377.44368: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
37031 1727204377.44438: Loaded config def from plugin (callback/minimal)
37031 1727204377.44441: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
37031 1727204377.44488: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
37031 1727204377.44550: Loaded config def from plugin (callback/tree)
37031 1727204377.44555: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
37031 1727204377.45607: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
37031 1727204377.45611: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
37031 1727204377.45643: in VariableManager get_vars()
37031 1727204377.45662: done with get_vars()
37031 1727204377.45672: in VariableManager get_vars()
37031 1727204377.45682: done with get_vars()
37031 1727204377.45686: variable 'omit' from source: magic vars
37031 1727204377.45723: in VariableManager get_vars()
37031 1727204377.45737: done with get_vars()
37031 1727204377.45762: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
37031 1727204377.53306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
37031 1727204377.53410: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
37031 1727204377.54169: getting the remaining hosts for this loop
37031 1727204377.54172: done getting the remaining hosts for this loop
37031 1727204377.54176: getting the next task for host managed-node2
37031 1727204377.54180: done getting next task for host managed-node2
37031 1727204377.54182: ^ task is: TASK: Gathering Facts
37031 1727204377.54185: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204377.54187: getting variables
37031 1727204377.54188: in VariableManager get_vars()
37031 1727204377.54203: Calling all_inventory to load vars for managed-node2
37031 1727204377.54205: Calling groups_inventory to load vars for managed-node2
37031 1727204377.54208: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204377.54222: Calling all_plugins_play to load vars for managed-node2
37031 1727204377.54233: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204377.54236: Calling groups_plugins_play to load vars for managed-node2
37031 1727204377.54278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204377.54336: done with get_vars()
37031 1727204377.54345: done getting variables
37031 1727204377.54422: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Tuesday 24 September 2024 14:59:37 -0400 (0:00:00.089) 0:00:00.089 *****
37031 1727204377.54449: entering _queue_task() for managed-node2/gather_facts
37031 1727204377.54450: Creating lock for gather_facts
37031 1727204377.55015: worker is 1 (out of 1 available)
37031 1727204377.55026: exiting _queue_task() for managed-node2/gather_facts
37031 1727204377.55041: done queuing things up, now waiting for results queue to drain
37031 1727204377.55042: waiting for pending results...
37031 1727204377.55758: running TaskExecutor() for managed-node2/TASK: Gathering Facts
37031 1727204377.55957: in run() - task 0affcd87-79f5-b754-dfb8-0000000000b9
37031 1727204377.55979: variable 'ansible_search_path' from source: unknown
37031 1727204377.56019: calling self._execute()
37031 1727204377.56204: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204377.56215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204377.56227: variable 'omit' from source: magic vars
37031 1727204377.56436: variable 'omit' from source: magic vars
37031 1727204377.56476: variable 'omit' from source: magic vars
37031 1727204377.56605: variable 'omit' from source: magic vars
37031 1727204377.56652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
37031 1727204377.56712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
37031 1727204377.56811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
37031 1727204377.56834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204377.56851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204377.57011: variable 'inventory_hostname' from source: host vars for 'managed-node2'
37031 1727204377.57019: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204377.57027: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204377.57243: Set connection var ansible_connection to ssh
37031 1727204377.57251: Set connection var ansible_shell_type to sh
37031 1727204377.57268: Set connection var ansible_pipelining to False
37031 1727204377.57280: Set connection var ansible_module_compression to ZIP_DEFLATED
37031 1727204377.57290: Set connection var ansible_timeout to 10
37031 1727204377.57299: Set connection var ansible_shell_executable to /bin/sh
37031 1727204377.57332: variable 'ansible_shell_executable' from source: unknown
37031 1727204377.57340: variable 'ansible_connection' from source: unknown
37031 1727204377.57347: variable 'ansible_module_compression' from source: unknown
37031 1727204377.57355: variable 'ansible_shell_type' from source: unknown
37031 1727204377.57361: variable 'ansible_shell_executable' from source: unknown
37031 1727204377.57442: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204377.57450: variable 'ansible_pipelining' from source: unknown
37031 1727204377.57460: variable 'ansible_timeout' from source: unknown
37031 1727204377.57472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204377.57876: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
37031 1727204377.57892: variable 'omit' from source: magic vars
37031 1727204377.57900: starting attempt loop
37031 1727204377.57906: running the handler
37031 1727204377.57923: variable 'ansible_facts' from source: unknown
37031 1727204377.57947: _low_level_execute_command(): starting
37031 1727204377.57963: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
37031 1727204377.61072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204377.61109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<<
37031 1727204377.61112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204377.61116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204377.61194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204377.61198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204377.61205: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204377.61261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204377.62931: stdout chunk (state=3): >>>/root <<<
37031 1727204377.63019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204377.63107: stderr chunk (state=3): >>><<<
37031 1727204377.63111: stdout chunk (state=3): >>><<<
37031 1727204377.63233: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
37031 1727204377.63237: _low_level_execute_command(): starting
37031 1727204377.63240: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325 `" && echo ansible-tmp-1727204377.6313155-37183-257486945022325="` echo /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325 `" ) && sleep 0'
37031 1727204377.64672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204377.64676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204377.64711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<<
37031 1727204377.64714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204377.64723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204377.64903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204377.64907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204377.64975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204377.65149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204377.67009: stdout chunk (state=3): >>>ansible-tmp-1727204377.6313155-37183-257486945022325=/root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325 <<<
37031 1727204377.67191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204377.67195: stdout chunk (state=3): >>><<<
37031 1727204377.67201: stderr chunk (state=3): >>><<<
37031 1727204377.67479: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204377.6313155-37183-257486945022325=/root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
37031 1727204377.67483: variable 'ansible_module_compression' from source: unknown
37031 1727204377.67485: ANSIBALLZ: Using generic lock for ansible.legacy.setup
37031 1727204377.67488: ANSIBALLZ: Acquiring lock
37031 1727204377.67490: ANSIBALLZ: Lock acquired: 140694173153808
37031 1727204377.67492: ANSIBALLZ: Creating module
37031 1727204378.29543: ANSIBALLZ: Writing module into payload
37031 1727204378.29795: ANSIBALLZ: Writing module
37031 1727204378.29839: ANSIBALLZ: Renaming module
37031 1727204378.29852: ANSIBALLZ: Done creating module
37031 1727204378.29900: variable 'ansible_facts' from source: unknown
37031 1727204378.29916: variable 'inventory_hostname' from source: host vars for 'managed-node2'
37031 1727204378.29929: _low_level_execute_command(): starting
37031 1727204378.29939: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
37031 1727204378.30636: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
37031 1727204378.30650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204378.30672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204378.30690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204378.30733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204378.30744: stderr chunk (state=3): >>>debug2: match not found <<<
37031 1727204378.30760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204378.30784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
37031 1727204378.30795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<<
37031 1727204378.30805: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
37031 1727204378.30815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204378.30827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204378.30844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204378.30857: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204378.30870: stderr chunk (state=3): >>>debug2: match found <<<
37031 1727204378.30884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204378.30967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204378.30985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204378.31002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204378.31092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204378.32745: stdout chunk (state=3): >>>PLATFORM <<<
37031 1727204378.32846: stdout chunk (state=3): >>>Linux <<<
37031 1727204378.32855: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9
/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 37031 1727204378.32995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204378.33088: stderr chunk (state=3): >>><<< 37031 1727204378.33091: stdout chunk (state=3): >>><<< 37031 1727204378.33113: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204378.33124 [managed-node2]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 37031 1727204378.33172: _low_level_execute_command(): starting 37031 1727204378.33175: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 37031 1727204378.33292: Sending initial data 37031 1727204378.33295: Sent initial data (1181 bytes) 37031 1727204378.33883: stderr chunk (state=3): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204378.33894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.33911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.33925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.33968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.33977: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204378.33987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.34001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204378.34016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204378.34023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204378.34031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.34041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.34055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.34058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.34068: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204378.34078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.34166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204378.34173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204378.34176: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 37031 1727204378.34249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204378.38547: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 37031 1727204378.39175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204378.39180: stdout chunk (state=3): >>><<< 37031 1727204378.39184: stderr chunk (state=3): >>><<< 37031 1727204378.39196: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204378.39257: variable 'ansible_facts' from source: unknown 37031 1727204378.39261: variable 'ansible_facts' from source: unknown 37031 1727204378.39268: variable 'ansible_module_compression' from source: unknown 37031 1727204378.39305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 37031 1727204378.39341: variable 'ansible_facts' from source: unknown 37031 1727204378.39471: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/AnsiballZ_setup.py 37031 1727204378.39619: Sending initial data 37031 1727204378.39622: Sent initial data (154 bytes) 37031 1727204378.40559: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204378.40570: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.40584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.40592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.40674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 
1727204378.40678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.40693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.40722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204378.40726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204378.40794: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204378.43205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204378.43245: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204378.43289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpudud6ntj /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/AnsiballZ_setup.py <<< 37031 1727204378.43333: 
stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204378.46067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204378.46089: stderr chunk (state=3): >>><<< 37031 1727204378.46092: stdout chunk (state=3): >>><<< 37031 1727204378.46105: done transferring module to remote 37031 1727204378.46213: _low_level_execute_command(): starting 37031 1727204378.46217: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/ /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/AnsiballZ_setup.py && sleep 0' 37031 1727204378.47363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204378.47872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.47876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.47922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.47926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.47928: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204378.47938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.47953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204378.47963: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204378.47977: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204378.47984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.47993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
37031 1727204378.48004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.48011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.48017: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204378.48026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.48104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204378.48119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204378.48122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204378.48672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204378.50442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204378.50447: stdout chunk (state=3): >>><<< 37031 1727204378.50453: stderr chunk (state=3): >>><<< 37031 1727204378.50479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204378.50482: _low_level_execute_command(): starting 37031 1727204378.50486: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/AnsiballZ_setup.py && sleep 0' 37031 1727204378.51550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204378.52085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.52095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.52109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.52150: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.52159: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204378.52174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.52187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204378.52380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204378.52386: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204378.52394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204378.52403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204378.52414: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204378.52421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204378.52428: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204378.52438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204378.52515: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204378.52534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204378.52546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204378.52628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204378.54555: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 37031 1727204378.54564: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 37031 1727204378.54625: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 37031 1727204378.54660: stdout chunk (state=3): >>>import 'posix' # <<< 37031 1727204378.54693: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 37031 1727204378.54737: stdout chunk (state=3): >>>import 'time' # <<< 37031 1727204378.54741: stdout chunk (state=3): >>>import 'zipimport' # <<< 37031 1727204378.54743: stdout chunk (state=3): >>> # installed zipimport hook <<< 37031 1727204378.54796: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 37031 1727204378.54809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.54814: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 
37031 1727204378.54839: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 37031 1727204378.54842: stdout chunk (state=3): >>>import '_codecs' # <<< 37031 1727204378.54876: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98dc0> <<< 37031 1727204378.54923: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 37031 1727204378.54927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 37031 1727204378.54929: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98b20> <<< 37031 1727204378.54964: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 37031 1727204378.54977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 37031 1727204378.54980: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98ac0> <<< 37031 1727204378.55020: stdout chunk (state=3): >>>import '_signal' # <<< 37031 1727204378.55023: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 37031 1727204378.55035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 37031 1727204378.55040: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d490> <<< 37031 1727204378.55071: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 37031 1727204378.55102: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 37031 1727204378.55117: stdout chunk (state=3): >>>import '_abc' # <<< 37031 1727204378.55125: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d940> <<< 37031 1727204378.55144: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d670> <<< 37031 1727204378.55174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 37031 1727204378.55180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 37031 1727204378.55215: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 37031 1727204378.55219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 37031 1727204378.55259: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 37031 1727204378.55267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 37031 1727204378.55292: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf190> <<< 37031 1727204378.55325: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 37031 1727204378.55332: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 37031 1727204378.55408: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf220> <<< 37031 1727204378.55435: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 37031 1727204378.55439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 37031 1727204378.55476: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py <<< 37031 1727204378.55480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbf2850> <<< 37031 1727204378.55482: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf940> <<< 37031 1727204378.55525: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de55880> <<< 37031 1727204378.55531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 37031 1727204378.55563: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbc8d90> <<< 37031 1727204378.55613: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py <<< 37031 1727204378.55616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 37031 1727204378.55619: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbf2d90> <<< 37031 1727204378.55685: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d970> <<< 37031 1727204378.55708: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 37031 1727204378.56038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 37031 1727204378.56068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 37031 1727204378.56083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 37031 1727204378.56086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 37031 1727204378.56106: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 37031 1727204378.56126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 37031 1727204378.56155: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 37031 1727204378.56167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 37031 1727204378.56171: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db92f10> <<< 37031 1727204378.56229: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db990a0> <<< 37031 1727204378.56248: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 37031 
1727204378.56268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 37031 1727204378.56271: stdout chunk (state=3): >>>import '_sre' # <<< 37031 1727204378.56297: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 37031 1727204378.56303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 37031 1727204378.56328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 37031 1727204378.56333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 37031 1727204378.56377: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db8c5b0> <<< 37031 1727204378.56381: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db936a0> <<< 37031 1727204378.56384: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db923d0> <<< 37031 1727204378.56404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 37031 1727204378.56480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 37031 1727204378.56498: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 37031 1727204378.56548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.56555: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches 
/usr/lib64/python3.9/heapq.py <<< 37031 1727204378.56558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 37031 1727204378.56605: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.56609: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3da7ae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7a940> <<< 37031 1727204378.56631: stdout chunk (state=3): >>>import 'itertools' # <<< 37031 1727204378.56635: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py <<< 37031 1727204378.56637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7af40> <<< 37031 1727204378.56676: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 37031 1727204378.56681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 37031 1727204378.56731: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7ad90> <<< 37031 1727204378.56735: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 37031 1727204378.56737: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 37031 1727204378.56752: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1f3da8b100> <<< 37031 1727204378.56757: stdout chunk (state=3): >>>import '_collections' # <<< 37031 1727204378.56804: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db6edc0> <<< 37031 1727204378.56810: stdout chunk (state=3): >>>import '_functools' # <<< 37031 1727204378.56839: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db676a0> <<< 37031 1727204378.56894: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 37031 1727204378.56900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db7a700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db9aeb0> <<< 37031 1727204378.56926: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py <<< 37031 1727204378.56929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 37031 1727204378.56961: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.56970: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3da8bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db6e2e0> <<< 37031 1727204378.57007: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.57014: stdout chunk (state=3): >>># extension module 'binascii' executed 
from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3db7a310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dba0a60> <<< 37031 1727204378.57074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 37031 1727204378.57088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 37031 1727204378.57103: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 37031 1727204378.57121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.57124: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 37031 1727204378.57127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 37031 1727204378.57129: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8bee0> <<< 37031 1727204378.57131: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8be20> <<< 37031 1727204378.57160: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py <<< 37031 1727204378.57167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8bd90> <<< 37031 1727204378.57193: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 37031 1727204378.57220: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 37031 1727204378.57223: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 37031 1727204378.57258: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 37031 1727204378.57302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 37031 1727204378.57342: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da5e400> <<< 37031 1727204378.57359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 37031 1727204378.57372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 37031 1727204378.57404: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da5e4f0> <<< 37031 1727204378.57523: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da93f70> <<< 37031 1727204378.57565: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8dac0> <<< 37031 1727204378.57571: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8d490> <<< 37031 1727204378.57597: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 37031 1727204378.57600: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 37031 1727204378.57649: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 37031 1727204378.57657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 37031 1727204378.57692: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 37031 1727204378.57697: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d992250> <<< 37031 1727204378.57732: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da49550> <<< 37031 1727204378.57782: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8df40> <<< 37031 1727204378.57787: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dba00d0> <<< 37031 1727204378.57824: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 37031 1727204378.57832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 37031 1727204378.57872: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 37031 1727204378.57876: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1f3d9a4b80> <<< 37031 1727204378.57878: stdout chunk (state=3): >>>import 'errno' # <<< 37031 1727204378.57925: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d9a4eb0> <<< 37031 1727204378.57940: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 37031 1727204378.57965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 37031 1727204378.57983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 37031 1727204378.57985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 37031 1727204378.57988: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b57c0> <<< 37031 1727204378.58013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 37031 1727204378.58042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 37031 1727204378.58072: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b5d00> <<< 37031 1727204378.58107: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d94f430> <<< 37031 
1727204378.58111: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9a4fa0> <<< 37031 1727204378.58140: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 37031 1727204378.58145: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 37031 1727204378.58194: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d95f310> <<< 37031 1727204378.58209: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b5640> <<< 37031 1727204378.58215: stdout chunk (state=3): >>>import 'pwd' # <<< 37031 1727204378.58250: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d95f3d0> <<< 37031 1727204378.58298: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8ba60> <<< 37031 1727204378.58314: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 37031 1727204378.58321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 37031 1727204378.58351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 37031 1727204378.58357: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 37031 1727204378.58411: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97b730> <<< 37031 1727204378.58419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 37031 1727204378.58422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 37031 1727204378.58458: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97b7f0> <<< 37031 1727204378.58486: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97b8e0> <<< 37031 1727204378.58515: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 37031 1727204378.58521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 37031 1727204378.58713: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.58718: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97bd30> <<< 37031 1727204378.58749: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.58757: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d985280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97b970> <<< 37031 1727204378.58782: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d96eac0> <<< 37031 1727204378.58803: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8b640> <<< 37031 1727204378.58827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 37031 1727204378.58890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 37031 1727204378.58923: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97bb20> <<< 37031 1727204378.59068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 37031 1727204378.59089: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1f3d8a3700> <<< 37031 1727204378.59312: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip' <<< 37031 1727204378.59319: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.59408: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.59437: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 37031 1727204378.59451: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.59461: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.59474: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 37031 1727204378.59495: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.60727: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.61688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2850> <<< 37031 1727204378.61694: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 37031 1727204378.61696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.61718: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 37031 1727204378.61741: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 37031 1727204378.61781: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.61784: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7e2160> <<< 37031 1727204378.61816: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2280> <<< 37031 1727204378.61843: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2fa0> <<< 37031 1727204378.61859: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 37031 1727204378.61874: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 37031 1727204378.61921: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e24f0> <<< 37031 1727204378.61924: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2dc0> import 'atexit' # <<< 37031 1727204378.61957: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7e2580> <<< 37031 1727204378.61979: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 37031 1727204378.62006: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 37031 1727204378.62048: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2100> <<< 37031 1727204378.62060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 37031 1727204378.62082: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 37031 1727204378.62097: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 37031 1727204378.62124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 37031 1727204378.62151: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 37031 1727204378.62234: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7a1f70> <<< 37031 1727204378.62274: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d188370> <<< 37031 1727204378.62303: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d188070> <<< 37031 1727204378.62323: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 37031 1727204378.62336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 37031 1727204378.62377: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d188cd0> <<< 37031 1727204378.62389: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7cadc0> <<< 37031 1727204378.62553: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ca3a0> <<< 37031 1727204378.62577: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 37031 1727204378.62601: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7caf40> <<< 37031 1727204378.62620: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 37031 1727204378.62632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 37031 1727204378.62669: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 37031 1727204378.62689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 37031 1727204378.62709: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 37031 1727204378.62730: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 37031 1727204378.62733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d817f40> <<< 37031 1727204378.62815: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ead60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ea430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d795af0> <<< 37031 1727204378.62854: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7ea550> <<< 37031 1727204378.62885: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ea580> <<< 37031 1727204378.62911: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 37031 1727204378.62925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 37031 1727204378.62939: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 37031 1727204378.62979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 37031 1727204378.63050: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.63053: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1f6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829280> <<< 37031 1727204378.63072: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 37031 1727204378.63086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 37031 1727204378.63141: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.63144: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1f4820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829400> <<< 37031 1727204378.63170: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 37031 1727204378.63204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.63230: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches 
/usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 37031 1727204378.63243: stdout chunk (state=3): >>>import '_string' # <<< 37031 1727204378.63299: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829c40> <<< 37031 1727204378.63430: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f47c0> <<< 37031 1727204378.63527: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7c21c0> <<< 37031 1727204378.63566: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.63574: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d8299d0> <<< 37031 1727204378.63610: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.63613: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d829550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d822940> <<< 37031 1727204378.63643: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 37031 1727204378.63657: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 37031 1727204378.63679: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 37031 1727204378.63727: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1e8910> <<< 37031 1727204378.63916: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d739dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f3550> <<< 37031 1727204378.63954: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1e8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f3970> <<< 37031 1727204378.63983: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 
37031 1727204378.63997: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 37031 1727204378.64008: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64084: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64170: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64174: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 37031 1727204378.64193: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64215: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64218: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 37031 1727204378.64229: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64324: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64420: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.64875: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.65323: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 37031 1727204378.65339: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 37031 1727204378.65384: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 37031 1727204378.65388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.65445: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.65448: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7627f0> <<< 37031 1727204378.65511: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 37031 1727204378.65515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7678b0> <<< 37031 1727204378.65528: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd89940> <<< 37031 1727204378.65588: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 37031 1727204378.65591: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.65606: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.65628: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py <<< 37031 1727204378.65631: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.65768: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.65888: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 37031 1727204378.65923: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7a0730> <<< 37031 1727204378.65926: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66316: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66692: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66746: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66820: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 37031 1727204378.66828: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66853: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66898: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 37031 1727204378.66903: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.66952: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67041: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 37031 1727204378.67066: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67070: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 37031 1727204378.67082: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67110: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67148: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 37031 1727204378.67336: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67523: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 37031 1727204378.67559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 37031 1727204378.67639: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e52e0> # zipimport: zlib available <<< 37031 1727204378.67705: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67789: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 37031 1727204378.67796: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 37031 1727204378.67812: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67849: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67891: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 37031 1727204378.67894: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67930: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.67976: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68066: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68127: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 37031 1727204378.68152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.68227: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d759880> <<< 37031 1727204378.68316: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cc1dac0> <<< 37031 1727204378.68366: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py <<< 37031 1727204378.68370: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68417: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68474: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68496: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68540: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 37031 1727204378.68559: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 37031 1727204378.68577: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 37031 1727204378.68623: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 37031 1727204378.68659: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 37031 1727204378.68726: stdout chunk (state=3): >>>import 
'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d76a910> <<< 37031 1727204378.68775: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7b4970> <<< 37031 1727204378.68828: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d79e850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 37031 1727204378.68858: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.68873: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.69386: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available <<< 37031 1727204378.69469: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.69556: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.69588: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.69622: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py <<< 37031 1727204378.69627: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.69848: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.70074: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.70119: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.70184: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.70213: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 37031 1727204378.70217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 37031 1727204378.70250: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 37031 1727204378.70290: stdout chunk (state=3): >>>import 'multiprocessing.process' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cb0ac70> <<< 37031 1727204378.70318: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 37031 1727204378.70324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 37031 1727204378.70343: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 37031 1727204378.70382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 37031 1727204378.70404: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 37031 1727204378.70425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 37031 1727204378.70429: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd6aa30> <<< 37031 1727204378.70475: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so'<<< 37031 1727204378.70543: stdout chunk (state=3): >>> <<< 37031 1727204378.70551: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cd6a9a0> <<< 37031 1727204378.70637: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cdb2b20> <<< 37031 1727204378.70659: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cdb2550> <<< 37031 1727204378.70692: stdout chunk (state=3): 
>>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e2e0> <<< 37031 1727204378.70698: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e970> <<< 37031 1727204378.70718: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 37031 1727204378.70742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 37031 1727204378.70771: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 37031 1727204378.70776: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 37031 1727204378.70808: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.70811: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cd4f2b0> <<< 37031 1727204378.70829: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd4fa00> <<< 37031 1727204378.70855: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 37031 1727204378.70865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 37031 1727204378.70894: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd4f940> <<< 37031 1727204378.70914: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 37031 1727204378.70935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 37031 1727204378.70992: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.70996: stdout chunk (state=3): >>>import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cb6b0d0> <<< 37031 1727204378.71017: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7553a0> <<< 37031 1727204378.71049: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 37031 1727204378.71060: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 37031 1727204378.71097: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71104: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71124: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available <<< 37031 1727204378.71182: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71255: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 37031 1727204378.71317: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71421: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 37031 1727204378.71455: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71491: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71504: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 37031 1727204378.71571: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71621: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 37031 1727204378.71634: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71683: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71728: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 37031 1727204378.71815: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71879: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.71949: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.72036: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 37031 1727204378.72048: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.72585: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.72951: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 37031 1727204378.73001: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73046: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73072: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73129: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 37031 1727204378.73133: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73159: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73176: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 37031 1727204378.73225: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73292: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 37031 1727204378.73296: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73336: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73351: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 37031 1727204378.73385: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73403: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 37031 1727204378.73460: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.73545: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from 
'/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 37031 1727204378.73590: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca52eb0> <<< 37031 1727204378.73593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 37031 1727204378.73605: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 37031 1727204378.74198: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca529d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 37031 1727204378.74702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 37031 1727204378.74803: stdout chunk (state=3): >>># extension module '_ssl' loaded from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.74807: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cac6bb0> <<< 37031 1727204378.75229: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca6aa60> <<< 37031 1727204378.75261: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 37031 1727204378.75307: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.75378: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.75460: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 37031 1727204378.75491: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.75621: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.75749: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.75908: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76127: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 37031 1727204378.76159: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 
37031 1727204378.76198: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76252: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76336: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 37031 1727204378.76350: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76407: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76494: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 37031 1727204378.76508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 37031 1727204378.76595: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204378.76648: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cacd040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cacd6d0> <<< 37031 1727204378.76684: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 37031 1727204378.76732: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76775: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 37031 1727204378.76790: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76840: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.76935: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 37031 1727204378.76938: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.77167: stdout chunk (state=3): >>># zipimport: zlib available<<< 37031 1727204378.77171: stdout chunk (state=3): >>> <<< 37031 1727204378.77369: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 37031 1727204378.77400: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.77543: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.77693: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.77752: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.77834: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 37031 1727204378.77848: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 37031 1727204378.77881: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78011: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 37031 1727204378.78052: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78252: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78463: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py <<< 37031 1727204378.78494: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 37031 1727204378.78506: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78681: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78848: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 37031 1727204378.78879: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78936: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.78997: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.79749: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.80302: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 37031 1727204378.80318: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 37031 1727204378.80392: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.80493: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 37031 1727204378.80508: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.80577: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.80668: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 37031 1727204378.80796: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.80949: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 37031 1727204378.81002: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 37031 1727204378.81007: stdout chunk (state=3): >>>import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 37031 1727204378.81020: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81080: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 37031 1727204378.81083: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 37031 1727204378.81150: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81235: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81411: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81591: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 37031 1727204378.81594: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81657: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81698: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 37031 1727204378.81710: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.81723: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 37031 1727204378.81801: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.82337: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 37031 1727204378.82563: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83088: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 37031 1727204378.83122: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83189: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 37031 1727204378.83209: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83266: stdout chunk (state=3): >>># zipimport: zlib available<<< 37031 1727204378.83269: stdout chunk (state=3): >>> <<< 37031 1727204378.83335: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from 
Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 37031 1727204378.83384: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83462: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 37031 1727204378.83468: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83569: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83709: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 37031 1727204378.83732: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83775: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 37031 1727204378.83788: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83835: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83925: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 37031 1727204378.83928: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.83961: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84003: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84068: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84139: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84242: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84431: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 37031 1727204378.84474: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 37031 1727204378.84494: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84560: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py<<< 37031 1727204378.84568: stdout chunk (state=3): >>> <<< 37031 1727204378.84585: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.84793: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.85129: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available <<< 37031 1727204378.85219: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available <<< 37031 1727204378.85314: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 37031 1727204378.85393: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.85426: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 37031 1727204378.85434: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.85524: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 37031 1727204378.85527: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 37031 1727204378.85653: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204378.86333: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 37031 1727204378.86341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 37031 1727204378.86369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 37031 1727204378.86385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 37031 1727204378.86669: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3ca50310> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca50460> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c9fbac0> <<< 37031 1727204378.87592: stdout chunk (state=3): >>>import 'gc' # <<< 37031 1727204378.95399: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 37031 1727204378.95443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 37031 1727204378.95472: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca50550> <<< 37031 1727204378.95513: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 37031 1727204378.95543: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 37031 1727204378.95581: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca125b0> <<< 37031 1727204378.95650: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py <<< 37031 1727204378.95676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204378.95733: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 37031 1727204378.95740: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' <<< 37031 1727204378.95761: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c863a60> <<< 37031 1727204378.95790: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c863640> <<< 37031 1727204378.96168: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 37031 1727204378.96174: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 37031 1727204379.16349: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.76, "5m": 0.6, "15m": 0.31}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "59", "second": "38", "epoch": "1727204378", "epoch_int": "1727204378", "date": "2024-09-24", "time": "14:59:38", "iso8601_micro": "2024-09-24T18:59:38.873380Z", "iso8601": "2024-09-24T18:59:38Z", "iso8601_basic": "20240924T145938873380", "iso8601_basic_short": "20240924T145938", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [],<<< 37031 1727204379.16402: stdout chunk (state=3): >>> "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]"<<< 37031 1727204379.16406: stdout chunk (state=3): >>>, "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", 
"tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2778, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 754, "free": 2778}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", 
"ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, <<< 37031 1727204379.16412: stdout chunk (state=3): >>>"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271024128, "block_size": 4096, "block_total": 65519355, "block_available": 64519293, "block_used": 1000062, "inode_total": 131071472, "inode_available": 130998224, "inode_used": 73248, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 37031 1727204379.17022: stdout chunk (state=3): >>># clear builtins._ <<< 37031 1727204379.17089: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 37031 1727204379.17176: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 37031 1727204379.17182: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 37031 1727204379.17185: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 37031 1727204379.17308: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] 
removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale <<< 37031 1727204379.17332: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq<<< 37031 1727204379.17335: stdout chunk (state=3): >>> # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib <<< 37031 1727204379.17338: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 37031 1727204379.17340: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 37031 1727204379.17342: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch <<< 37031 1727204379.17345: stdout chunk (state=3): >>># cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2<<< 37031 1727204379.17347: stdout chunk (state=3): >>> # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect<<< 37031 1727204379.17349: stdout chunk (state=3): >>> # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile<<< 37031 1727204379.17351: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible <<< 37031 1727204379.17357: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal <<< 37031 1727204379.17359: stdout chunk (state=3): >>># cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # 
cleanup[2] removing tokenize <<< 37031 1727204379.17361: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 37031 1727204379.17363: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid<<< 37031 1727204379.17365: stdout chunk (state=3): >>> # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common <<< 37031 1727204379.17372: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # 
destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors <<< 37031 1727204379.17375: stdout chunk (state=3): >>># destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 37031 1727204379.17377: stdout chunk (state=3): >>># cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info <<< 37031 1727204379.17379: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # 
cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool <<< 37031 1727204379.17381: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps <<< 37031 1727204379.17383: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime <<< 37031 1727204379.17385: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] 
removing glob # cleanup[2] removing configparser<<< 37031 1727204379.17387: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python<<< 37031 1727204379.17389: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing 
ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd <<< 37031 1727204379.17391: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux <<< 37031 1727204379.17394: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector <<< 37031 1727204379.17396: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy 
ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps <<< 37031 1727204379.17398: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user<<< 37031 1727204379.17400: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly <<< 37031 1727204379.17402: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy 
ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd <<< 37031 1727204379.17404: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly <<< 37031 1727204379.17406: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep <<< 37031 1727204379.17408: stdout chunk (state=3): >>># cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection<<< 37031 1727204379.17410: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.dummy <<< 37031 1727204379.17611: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 37031 
1727204379.17618: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 37031 1727204379.17665: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 37031 1727204379.17671: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 37031 1727204379.17703: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 37031 1727204379.17706: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 37031 1727204379.17709: stdout chunk (state=3): >>># destroy _json # destroy encodings <<< 37031 1727204379.17740: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 37031 1727204379.17781: stdout chunk (state=3): >>># destroy selinux <<< 37031 1727204379.17787: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 37031 1727204379.17828: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 37031 1727204379.17832: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues <<< 37031 1727204379.17835: stdout chunk (state=3): >>># destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 37031 1727204379.17859: stdout chunk (state=3): >>># destroy queue <<< 37031 1727204379.17862: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 37031 1727204379.17899: stdout chunk (state=3): >>># destroy shlex <<< 37031 1727204379.17918: stdout chunk (state=3): >>># destroy datetime <<< 37031 1727204379.17932: stdout chunk (state=3): >>># destroy base64 <<< 37031 1727204379.17944: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 37031 1727204379.17949: stdout 
chunk (state=3): >>># destroy json <<< 37031 1727204379.17983: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 37031 1727204379.17986: stdout chunk (state=3): >>># destroy glob <<< 37031 1727204379.17989: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util <<< 37031 1727204379.17993: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 37031 1727204379.18097: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep <<< 37031 1727204379.18222: stdout chunk (state=3): >>># cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 37031 1727204379.18227: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes <<< 37031 1727204379.18230: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 37031 1727204379.18280: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 37031 1727204379.18284: stdout chunk (state=3): >>># destroy subprocess <<< 37031 1727204379.18287: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] 
wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal <<< 37031 1727204379.18290: stdout chunk (state=3): >>># cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 37031 1727204379.18292: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 37031 1727204379.18294: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 37031 1727204379.18297: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 37031 1727204379.18299: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap <<< 37031 1727204379.18301: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re <<< 37031 1727204379.18303: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator <<< 37031 1727204379.18305: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 37031 1727204379.18307: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale <<< 37031 1727204379.18309: stdout chunk (state=3): >>># cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 37031 1727204379.18312: stdout chunk (state=3): >>># cleanup[3] 
wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 37031 1727204379.18314: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 37031 1727204379.18316: stdout chunk (state=3): >>># cleanup[3] wiping sys <<< 37031 1727204379.18318: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 37031 1727204379.18320: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl <<< 37031 1727204379.18322: stdout chunk (state=3): >>># destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket <<< 37031 1727204379.18324: stdout chunk (state=3): >>># destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 37031 1727204379.18441: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 37031 1727204379.18444: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize <<< 37031 1727204379.18472: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 37031 1727204379.18475: stdout chunk (state=3): >>># destroy stat <<< 37031 1727204379.18490: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 37031 
1727204379.18518: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 37031 1727204379.18521: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 37031 1727204379.18570: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 37031 1727204379.19008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204379.19012: stdout chunk (state=3): >>><<< 37031 1727204379.19069: stderr chunk (state=3): >>><<< 37031 1727204379.19282: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de98ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf190> # 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbf2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de55880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dbf2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3de3d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db92f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db990a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db8c5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db936a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db923d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3da7ae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7a940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7af40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da7ad90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8b100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db6edc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db676a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db7a700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db9aeb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3da8bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3db6e2e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3db7a310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dba0a60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8bee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8be20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8bd90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da5e400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da5e4f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da93f70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8dac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d992250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da49550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8df40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3dba00d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9a4b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d9a4eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b57c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b5d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d94f430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9a4fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d95f310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d9b5640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d95f3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8ba60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97b730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97ba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97b7f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97b8e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d97bd30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d985280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97b970> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d96eac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3da8b640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d97bb20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1f3d8a3700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7e2160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e24f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7e2580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e2100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7a1f70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d188370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d188070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d188cd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7cadc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ca3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7caf40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d817f40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ead60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ea430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d795af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7ea550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7ea580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1f6fa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1f4820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d829c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f47c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7c21c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d8299d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d829550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d822940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1e8910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d739dc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f3550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d1e8eb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d1f3970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d7627f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7678b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd89940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7a0730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7e52e0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3d759880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cc1dac0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d76a910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d7b4970> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1f3d79e850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cb0ac70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd6aa30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cd6a9a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cdb2b20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cdb2550> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e2e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cd4f2b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd4fa00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd4f940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cb6b0d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1f3d7553a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cd9e670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca52eb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca529d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cac6bb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca6aa60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3cacd040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3cacd6d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_dpzuzker/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1f3ca50310> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca50460> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c9fbac0> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca50550> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3ca125b0> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c863a60> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1f3c863640> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.76, "5m": 0.6, "15m": 0.31}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "59", "second": "38", "epoch": "1727204378", "epoch_int": "1727204378", "date": "2024-09-24", "time": "14:59:38", "iso8601_micro": "2024-09-24T18:59:38.873380Z", "iso8601": "2024-09-24T18:59:38Z", "iso8601_basic": "20240924T145938873380", "iso8601_basic_short": "20240924T145938", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off 
[fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off 
[fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": 
["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2778, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 754, "free": 2778}, "nocache": {"free": 3254, "used": 278}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": 
"mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 741, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264271024128, "block_size": 4096, "block_total": 65519355, "block_available": 64519293, "block_used": 1000062, "inode_total": 131071472, "inode_available": 130998224, "inode_used": 73248, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear 
sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # 
cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # 
destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # 
cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # 
cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] 
removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # 
destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] 
wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # 
destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
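The SSH debug output above shows the connection being served by an already-running control master ("auto-mux: Trying existing master" followed by "mux_client_request_session: master session id: 2"), so the module invocation reuses one multiplexed SSH channel instead of performing a fresh handshake. A minimal sketch of the `ansible.cfg` settings that enable this behavior (illustrative values; not taken from this run's actual configuration):

```ini
; ansible.cfg -- connection multiplexing sketch (assumed values)
[ssh_connection]
; ControlMaster=auto creates/reuses a shared master connection per host;
; ControlPersist keeps it alive between tasks, which is what produces the
; "auto-mux: Trying existing master" lines seen in the debug log.
ssh_args = -o ControlMaster=auto -o ControlPersist=60s
; pipelining reduces round trips by sending the module over the open channel
pipelining = True
```

With multiplexing active, only the first task per host pays the full SSH negotiation cost; subsequent tasks attach to the existing socket, as reflected in the short "mux_client_hello_exchange" sequence above.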
[WARNING]: Module invocation had junk after the JSON data. [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
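The interpreter-discovery warning above can typically be avoided by pinning the interpreter explicitly for the host. A minimal YAML inventory sketch, assuming the host name and interpreter path reported in this log (`ansible_python_interpreter` is the standard Ansible variable for this):

```yaml
all:
  hosts:
    managed-node2:
      ansible_host: 10.31.13.78
      # Pin the interpreter so discovery (and its warning) is skipped
      ansible_python_interpreter: /usr/bin/python3.9
```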
37031 1727204379.20474: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204379.20477: _low_level_execute_command(): starting 37031 1727204379.20479: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204377.6313155-37183-257486945022325/ > /dev/null 2>&1 && sleep 0' 37031 1727204379.21031: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204379.21047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.21069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.21088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.21131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.21144: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204379.21157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.21176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204379.21187: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.13.78 is address <<< 37031 1727204379.21196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204379.21207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.21219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.21234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.21245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.21255: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204379.21276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.21346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.21373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204379.21391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.21461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204379.23350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.23353: stdout chunk (state=3): >>><<< 37031 1727204379.23356: stderr chunk (state=3): >>><<< 37031 1727204379.23772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204379.23776: handler run complete 37031 1727204379.23778: variable 'ansible_facts' from source: unknown 37031 1727204379.23781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.23948: variable 'ansible_facts' from source: unknown 37031 1727204379.24043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.24184: attempt loop complete, returning result 37031 1727204379.24194: _execute() done 37031 1727204379.24206: dumping result to json 37031 1727204379.24242: done dumping result, returning 37031 1727204379.24256: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-b754-dfb8-0000000000b9] 37031 1727204379.24269: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b9 37031 1727204379.24768: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b9 ok: [managed-node2] 37031 1727204379.24884: no more pending results, returning what we have 37031 1727204379.24887: results queue empty 37031 1727204379.24888: checking for any_errors_fatal 37031 1727204379.24889: done checking for any_errors_fatal 37031 1727204379.24890: checking for max_fail_percentage 37031 1727204379.24892: done checking for max_fail_percentage 37031 1727204379.24892: checking 
to see if all hosts have failed and the running result is not ok 37031 1727204379.24893: done checking to see if all hosts have failed 37031 1727204379.24894: getting the remaining hosts for this loop 37031 1727204379.24896: done getting the remaining hosts for this loop 37031 1727204379.24899: getting the next task for host managed-node2 37031 1727204379.24905: done getting next task for host managed-node2 37031 1727204379.24907: ^ task is: TASK: meta (flush_handlers) 37031 1727204379.24909: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204379.24912: getting variables 37031 1727204379.24914: in VariableManager get_vars() 37031 1727204379.24935: Calling all_inventory to load vars for managed-node2 37031 1727204379.24937: Calling groups_inventory to load vars for managed-node2 37031 1727204379.24940: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.24947: WORKER PROCESS EXITING 37031 1727204379.24959: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.24962: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.24968: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.25190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.25917: done with get_vars() 37031 1727204379.25928: done getting variables 37031 1727204379.25996: in VariableManager get_vars() 37031 1727204379.26011: Calling all_inventory to load vars for managed-node2 37031 1727204379.26014: Calling groups_inventory to load vars for managed-node2 37031 1727204379.26016: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.26021: Calling 
all_plugins_play to load vars for managed-node2 37031 1727204379.26023: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.26030: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.26192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.26539: done with get_vars() 37031 1727204379.26677: done queuing things up, now waiting for results queue to drain 37031 1727204379.26680: results queue empty 37031 1727204379.26681: checking for any_errors_fatal 37031 1727204379.26683: done checking for any_errors_fatal 37031 1727204379.26684: checking for max_fail_percentage 37031 1727204379.26685: done checking for max_fail_percentage 37031 1727204379.26686: checking to see if all hosts have failed and the running result is not ok 37031 1727204379.26687: done checking to see if all hosts have failed 37031 1727204379.26687: getting the remaining hosts for this loop 37031 1727204379.26688: done getting the remaining hosts for this loop 37031 1727204379.26691: getting the next task for host managed-node2 37031 1727204379.26696: done getting next task for host managed-node2 37031 1727204379.26698: ^ task is: TASK: Include the task 'el_repo_setup.yml' 37031 1727204379.26700: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204379.26702: getting variables 37031 1727204379.26703: in VariableManager get_vars() 37031 1727204379.26711: Calling all_inventory to load vars for managed-node2 37031 1727204379.26713: Calling groups_inventory to load vars for managed-node2 37031 1727204379.26715: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.26720: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.26722: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.26725: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.26975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.27360: done with get_vars() 37031 1727204379.27372: done getting variables 
TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11
Tuesday 24 September 2024 14:59:39 -0400 (0:00:01.730) 0:00:01.819 *****
37031 1727204379.27495: entering _queue_task() for managed-node2/include_tasks 37031 1727204379.27498: Creating lock for include_tasks 37031 1727204379.27886: worker is 1 (out of 1 available) 37031 1727204379.27898: exiting _queue_task() for managed-node2/include_tasks 37031 1727204379.27910: done queuing things up, now waiting for results queue to drain 37031 1727204379.27911: waiting for pending results... 
37031 1727204379.28267: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 37031 1727204379.28327: in run() - task 0affcd87-79f5-b754-dfb8-000000000006 37031 1727204379.28337: variable 'ansible_search_path' from source: unknown 37031 1727204379.28387: calling self._execute() 37031 1727204379.28432: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.28436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.28445: variable 'omit' from source: magic vars 37031 1727204379.28519: _execute() done 37031 1727204379.28523: dumping result to json 37031 1727204379.28527: done dumping result, returning 37031 1727204379.28530: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-b754-dfb8-000000000006] 37031 1727204379.28535: sending task result for task 0affcd87-79f5-b754-dfb8-000000000006 37031 1727204379.28631: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000006 37031 1727204379.28633: WORKER PROCESS EXITING 37031 1727204379.28671: no more pending results, returning what we have 37031 1727204379.28676: in VariableManager get_vars() 37031 1727204379.28708: Calling all_inventory to load vars for managed-node2 37031 1727204379.28710: Calling groups_inventory to load vars for managed-node2 37031 1727204379.28713: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.28727: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.28730: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.28733: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.28891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.28997: done with get_vars() 37031 1727204379.29003: variable 'ansible_search_path' from source: unknown 37031 1727204379.29013: we have 
included files to process 37031 1727204379.29014: generating all_blocks data 37031 1727204379.29015: done generating all_blocks data 37031 1727204379.29016: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 37031 1727204379.29016: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 37031 1727204379.29018: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 37031 1727204379.29582: in VariableManager get_vars() 37031 1727204379.29597: done with get_vars() 37031 1727204379.29608: done processing included file 37031 1727204379.29610: iterating over new_blocks loaded from include file 37031 1727204379.29611: in VariableManager get_vars() 37031 1727204379.29619: done with get_vars() 37031 1727204379.29621: filtering new block on tags 37031 1727204379.29635: done filtering new block on tags 37031 1727204379.29637: in VariableManager get_vars() 37031 1727204379.29646: done with get_vars() 37031 1727204379.29647: filtering new block on tags 37031 1727204379.29665: done filtering new block on tags 37031 1727204379.29669: in VariableManager get_vars() 37031 1727204379.29680: done with get_vars() 37031 1727204379.29681: filtering new block on tags 37031 1727204379.29698: done filtering new block on tags 37031 1727204379.29701: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 37031 1727204379.29707: extending task lists for all hosts with included blocks 37031 1727204379.29777: done extending task lists 37031 1727204379.29779: done processing included files 37031 1727204379.29779: results queue empty 37031 1727204379.29780: checking for any_errors_fatal 37031 1727204379.29781: done checking for any_errors_fatal 37031 
1727204379.29782: checking for max_fail_percentage 37031 1727204379.29783: done checking for max_fail_percentage 37031 1727204379.29784: checking to see if all hosts have failed and the running result is not ok 37031 1727204379.29785: done checking to see if all hosts have failed 37031 1727204379.29785: getting the remaining hosts for this loop 37031 1727204379.29786: done getting the remaining hosts for this loop 37031 1727204379.29788: getting the next task for host managed-node2 37031 1727204379.29792: done getting next task for host managed-node2 37031 1727204379.29794: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 37031 1727204379.29796: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204379.29798: getting variables 37031 1727204379.29799: in VariableManager get_vars() 37031 1727204379.29806: Calling all_inventory to load vars for managed-node2 37031 1727204379.29808: Calling groups_inventory to load vars for managed-node2 37031 1727204379.29810: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.29815: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.29817: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.29820: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.29958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.30141: done with get_vars() 37031 1727204379.30152: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:59:39 -0400 (0:00:00.027) 0:00:01.847 ***** 37031 1727204379.30217: entering _queue_task() for managed-node2/setup 37031 1727204379.30502: worker is 1 (out of 1 available) 37031 1727204379.30513: exiting _queue_task() for managed-node2/setup 37031 1727204379.30525: done queuing things up, now waiting for results queue to drain 37031 1727204379.30527: waiting for pending results... 
37031 1727204379.30770: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 37031 1727204379.30883: in run() - task 0affcd87-79f5-b754-dfb8-0000000000ca 37031 1727204379.31001: variable 'ansible_search_path' from source: unknown 37031 1727204379.31012: variable 'ansible_search_path' from source: unknown 37031 1727204379.31052: calling self._execute() 37031 1727204379.31124: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.31139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.31152: variable 'omit' from source: magic vars 37031 1727204379.31800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204379.34186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204379.34238: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204379.34272: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204379.34299: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204379.34319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204379.34424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204379.34458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204379.34497: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204379.34542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204379.34568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204379.34787: variable 'ansible_facts' from source: unknown 37031 1727204379.34860: variable 'network_test_required_facts' from source: task vars 37031 1727204379.34917: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 37031 1727204379.34924: when evaluation is False, skipping this task 37031 1727204379.34931: _execute() done 37031 1727204379.34938: dumping result to json 37031 1727204379.34945: done dumping result, returning 37031 1727204379.34955: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-b754-dfb8-0000000000ca] 37031 1727204379.34965: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ca 37031 1727204379.35091: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ca 37031 1727204379.35098: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 37031 1727204379.35179: no more pending results, returning what we have 37031 1727204379.35183: results queue empty 37031 1727204379.35184: checking for any_errors_fatal 37031 1727204379.35186: 
done checking for any_errors_fatal 37031 1727204379.35186: checking for max_fail_percentage 37031 1727204379.35188: done checking for max_fail_percentage 37031 1727204379.35189: checking to see if all hosts have failed and the running result is not ok 37031 1727204379.35190: done checking to see if all hosts have failed 37031 1727204379.35191: getting the remaining hosts for this loop 37031 1727204379.35193: done getting the remaining hosts for this loop 37031 1727204379.35198: getting the next task for host managed-node2 37031 1727204379.35210: done getting next task for host managed-node2 37031 1727204379.35214: ^ task is: TASK: Check if system is ostree 37031 1727204379.35217: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204379.35222: getting variables 37031 1727204379.35224: in VariableManager get_vars() 37031 1727204379.35298: Calling all_inventory to load vars for managed-node2 37031 1727204379.35301: Calling groups_inventory to load vars for managed-node2 37031 1727204379.35305: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.35315: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.35317: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.35320: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.35540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.35760: done with get_vars() 37031 1727204379.35773: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:59:39 -0400 (0:00:00.057) 0:00:01.904 ***** 37031 1727204379.36001: entering _queue_task() for managed-node2/stat 37031 1727204379.36469: worker is 1 (out of 1 available) 37031 1727204379.36481: exiting _queue_task() for managed-node2/stat 37031 1727204379.36512: done queuing things up, now waiting for results queue to drain 37031 1727204379.36514: waiting for pending results... 
37031 1727204379.36752: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 37031 1727204379.36823: in run() - task 0affcd87-79f5-b754-dfb8-0000000000cc 37031 1727204379.36836: variable 'ansible_search_path' from source: unknown 37031 1727204379.36845: variable 'ansible_search_path' from source: unknown 37031 1727204379.36878: calling self._execute() 37031 1727204379.36926: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.36930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.36938: variable 'omit' from source: magic vars 37031 1727204379.37289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204379.37465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204379.37500: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204379.37526: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204379.37555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204379.37643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204379.37665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204379.37684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204379.37703: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204379.37795: Evaluated conditional (not __network_is_ostree is defined): True 37031 1727204379.37800: variable 'omit' from source: magic vars 37031 1727204379.37830: variable 'omit' from source: magic vars 37031 1727204379.37856: variable 'omit' from source: magic vars 37031 1727204379.37878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204379.37898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204379.37911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204379.37932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204379.37935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204379.37960: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204379.37963: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.37967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.38027: Set connection var ansible_connection to ssh 37031 1727204379.38032: Set connection var ansible_shell_type to sh 37031 1727204379.38035: Set connection var ansible_pipelining to False 37031 1727204379.38044: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204379.38052: Set connection var ansible_timeout to 10 37031 1727204379.38057: Set connection var ansible_shell_executable to /bin/sh 37031 1727204379.38082: variable 'ansible_shell_executable' from source: unknown 37031 1727204379.38085: variable 'ansible_connection' from 
source: unknown 37031 1727204379.38088: variable 'ansible_module_compression' from source: unknown 37031 1727204379.38090: variable 'ansible_shell_type' from source: unknown 37031 1727204379.38092: variable 'ansible_shell_executable' from source: unknown 37031 1727204379.38094: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.38097: variable 'ansible_pipelining' from source: unknown 37031 1727204379.38101: variable 'ansible_timeout' from source: unknown 37031 1727204379.38104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.38209: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204379.38217: variable 'omit' from source: magic vars 37031 1727204379.38222: starting attempt loop 37031 1727204379.38224: running the handler 37031 1727204379.38234: _low_level_execute_command(): starting 37031 1727204379.38241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204379.38785: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.38803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.38827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.38841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.38888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.38900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.38956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204379.41126: stdout chunk (state=3): >>>/root <<< 37031 1727204379.41276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.41329: stderr chunk (state=3): >>><<< 37031 1727204379.41332: stdout chunk (state=3): >>><<< 37031 1727204379.41351: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204379.41363: _low_level_execute_command(): starting 37031 1727204379.41373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429 `" && echo ansible-tmp-1727204379.413507-37250-123011068968429="` echo /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429 `" ) && sleep 0' 37031 1727204379.41836: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.41848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.41859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204379.41872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204379.41884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.41937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.41957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 37031 1727204379.41994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204379.44595: stdout chunk (state=3): >>>ansible-tmp-1727204379.413507-37250-123011068968429=/root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429 <<< 37031 1727204379.44752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.44803: stderr chunk (state=3): >>><<< 37031 1727204379.44809: stdout chunk (state=3): >>><<< 37031 1727204379.44830: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204379.413507-37250-123011068968429=/root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204379.44872: variable 'ansible_module_compression' from source: unknown 37031 1727204379.44917: ANSIBALLZ: 
Using lock for stat 37031 1727204379.44921: ANSIBALLZ: Acquiring lock 37031 1727204379.44928: ANSIBALLZ: Lock acquired: 140694173155296 37031 1727204379.44930: ANSIBALLZ: Creating module 37031 1727204379.55646: ANSIBALLZ: Writing module into payload 37031 1727204379.55808: ANSIBALLZ: Writing module 37031 1727204379.55848: ANSIBALLZ: Renaming module 37031 1727204379.55859: ANSIBALLZ: Done creating module 37031 1727204379.55889: variable 'ansible_facts' from source: unknown 37031 1727204379.55958: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/AnsiballZ_stat.py 37031 1727204379.56126: Sending initial data 37031 1727204379.56136: Sent initial data (152 bytes) 37031 1727204379.58048: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204379.58065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.58083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.58101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.58152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.58372: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204379.58558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.58584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204379.58606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204379.58621: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204379.58634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.58650: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 37031 1727204379.58693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.58710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.58721: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204379.58736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.58887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.58926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204379.58945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.59032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204379.60800: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204379.60804: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp1c31yb28 /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/AnsiballZ_stat.py <<< 37031 1727204379.60837: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 
37031 1727204379.62209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.62393: stderr chunk (state=3): >>><<< 37031 1727204379.62397: stdout chunk (state=3): >>><<< 37031 1727204379.62399: done transferring module to remote 37031 1727204379.62406: _low_level_execute_command(): starting 37031 1727204379.62408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/ /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/AnsiballZ_stat.py && sleep 0' 37031 1727204379.63933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204379.63947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.63962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.63982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.64029: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.64045: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204379.64059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.64078: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204379.64146: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204379.64161: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204379.64175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.64188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.64202: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.64212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.64222: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204379.64233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.64318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.64486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204379.64502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.64581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204379.66504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.66508: stdout chunk (state=3): >>><<< 37031 1727204379.66510: stderr chunk (state=3): >>><<< 37031 1727204379.66614: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204379.66618: _low_level_execute_command(): starting 37031 1727204379.66620: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/AnsiballZ_stat.py && sleep 0' 37031 1727204379.67600: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204379.67623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.67641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.67661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.67708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.67730: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204379.67745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.67766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204379.67780: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204379.67791: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204379.67804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.67820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.67844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.67857: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204379.67872: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204379.67891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.67977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.67998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204379.68015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.68100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204379.70046: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # <<< 37031 1727204379.70049: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 37031 1727204379.70109: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 37031 1727204379.70141: stdout chunk (state=3): >>>import 'posix' # <<< 37031 1727204379.70180: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 37031 1727204379.70183: stdout chunk (state=3): >>># installing zipimport hook <<< 37031 1727204379.70223: stdout chunk (state=3): >>>import 'time' # <<< 37031 1727204379.70226: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 37031 1727204379.70282: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.70293: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 37031 1727204379.70322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 37031 
1727204379.70352: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098dc0> <<< 37031 1727204379.70389: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 37031 1727204379.70415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d3a0> <<< 37031 1727204379.70418: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098b20> <<< 37031 1727204379.70441: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 37031 1727204379.70446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 37031 1727204379.70468: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098ac0> <<< 37031 1727204379.70483: stdout chunk (state=3): >>>import '_signal' # <<< 37031 1727204379.70516: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 37031 1727204379.70523: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d490> <<< 37031 1727204379.70571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 37031 1727204379.70593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches 
/usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 37031 1727204379.70600: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d940> <<< 37031 1727204379.70628: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d670> <<< 37031 1727204379.70663: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 37031 1727204379.70672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 37031 1727204379.70688: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 37031 1727204379.70710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 37031 1727204379.70724: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 37031 1727204379.70750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 37031 1727204379.70779: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf190> <<< 37031 1727204379.70794: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 37031 1727204379.70819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 37031 1727204379.70891: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf220> <<< 37031 1727204379.70920: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches 
/usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 37031 1727204379.70963: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf940> <<< 37031 1727204379.71009: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8055880> <<< 37031 1727204379.71027: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dc8d90> <<< 37031 1727204379.71093: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 37031 1727204379.71100: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7df2d90> <<< 37031 1727204379.71155: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d970> <<< 37031 1727204379.71181: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 37031 1727204379.71380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 37031 1727204379.71397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 37031 1727204379.71421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 37031 1727204379.71449: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 37031 1727204379.71466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 37031 1727204379.71495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 37031 1727204379.71514: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6df10> <<< 37031 1727204379.71582: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d740a0> <<< 37031 1727204379.71589: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 37031 1727204379.71613: stdout chunk (state=3): >>>import '_sre' # <<< 37031 1727204379.71628: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 37031 1727204379.71650: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 37031 1727204379.71677: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 37031 1727204379.71703: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d675b0> <<< 37031 1727204379.71720: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6d3d0> <<< 37031 1727204379.71748: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 37031 1727204379.71822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 37031 1727204379.71840: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 37031 1727204379.71875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.71898: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 37031 1727204379.71947: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7cf1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf19a0> <<< 37031 1727204379.71953: stdout chunk (state=3): >>>import 'itertools' # <<< 37031 1727204379.71981: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf1fa0> <<< 37031 1727204379.72016: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 37031 1727204379.72044: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf1df0> <<< 37031 1727204379.72091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01160> <<< 37031 1727204379.72097: stdout chunk (state=3): >>>import '_collections' # <<< 37031 1727204379.72151: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d49e20> import '_functools' # <<< 37031 1727204379.72184: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d41700> <<< 37031 1727204379.72237: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d55760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d75eb0> <<< 37031 1727204379.72277: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from 
'/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 37031 1727204379.72309: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7d01d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d49340> <<< 37031 1727204379.72359: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204379.72368: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7d55370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d7ba60> <<< 37031 1727204379.72389: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 37031 1727204379.72434: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.72467: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 37031 1727204379.72474: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff6b7d01e80> <<< 37031 1727204379.72506: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01df0> <<< 37031 1727204379.72536: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 37031 1727204379.72573: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 37031 1727204379.72600: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 37031 1727204379.73348: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cd5460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cd5550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cb30d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d04b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff6b7d044c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c092b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cc0d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d04fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c19be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7c19f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from 
'/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2c820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2cd60> <<< 37031 1727204379.73357: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bc5490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c19f40> <<< 37031 1727204379.73782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bd5370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2c6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bd5430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches 
/usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf1850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 37031 1727204379.74055: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204379.74077: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bfb2e0> <<< 37031 1727204379.74102: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf19d0> <<< 37031 1727204379.74115: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7be5b20> <<< 37031 1727204379.74149: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d016a0> <<< 37031 1727204379.74194: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 37031 1727204379.74284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 37031 1727204379.74337: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf1b80> <<< 37031 1727204379.74508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff6b75e6760> <<< 37031 1727204379.74743: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip' <<< 37031 1727204379.74746: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.74877: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.74923: stdout chunk (state=3): >>>import ansible # loaded from Zip 
/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 37031 1727204379.74957: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.74973: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 37031 1727204379.76777: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.77776: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d8b0> <<< 37031 1727204379.77811: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.77838: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 37031 1727204379.77855: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b750d160> <<< 37031 1727204379.77882: stdout chunk (state=3): >>>import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d280> <<< 37031 1727204379.77913: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d5e0> <<< 37031 1727204379.77939: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 37031 1727204379.77991: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750de20> import 'atexit' # <<< 37031 1727204379.78030: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b750d580> <<< 37031 1727204379.78042: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 37031 1727204379.78071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 37031 1727204379.78112: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d100> <<< 37031 1727204379.78150: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 37031 1727204379.78155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 37031 1727204379.78198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 37031 1727204379.78214: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 37031 1727204379.78217: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 37031 1727204379.78281: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7464fd0> <<< 37031 1727204379.78318: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7482c40> <<< 37031 1727204379.78345: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7482f40> <<< 37031 1727204379.78371: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 37031 1727204379.78401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 37031 1727204379.78438: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74822e0> <<< 37031 1727204379.78449: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7575d90> <<< 37031 1727204379.78625: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff6b75753a0> <<< 37031 1727204379.78717: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 37031 1727204379.78720: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7575f40> <<< 37031 1727204379.78761: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 37031 1727204379.78783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 37031 1727204379.78786: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b75e6a90> <<< 37031 1727204379.78871: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e0dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e0490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7517580> <<< 37031 1727204379.78910: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74e05b0> <<< 37031 1727204379.78948: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e05e0> <<< 37031 1727204379.78985: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 37031 1727204379.79010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 37031 1727204379.79025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 37031 1727204379.79115: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7455f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b75552e0> <<< 37031 1727204379.79129: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 37031 1727204379.79186: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7555460> <<< 37031 1727204379.79213: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 37031 1727204379.79285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.79288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 37031 1727204379.79342: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b756df40> <<< 37031 1727204379.79475: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7452790> <<< 37031 1727204379.79559: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74525e0> <<< 37031 1727204379.79593: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7451550> <<< 37031 1727204379.79661: stdout chunk 
(state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7451490> <<< 37031 1727204379.79679: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b754c9a0> <<< 37031 1727204379.79703: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 37031 1727204379.79756: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d66a0> <<< 37031 1727204379.79942: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d5bb0> <<< 37031 1727204379.79945: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e60d0> <<< 37031 1727204379.79991: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' <<< 37031 1727204379.80023: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d6100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7519c40> # zipimport: zlib available # zipimport: zlib available <<< 37031 1727204379.80042: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 37031 1727204379.80045: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.80108: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.80212: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.80237: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 37031 1727204379.80257: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 37031 1727204379.80358: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.80449: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.80904: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.81378: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 
'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 37031 1727204379.81418: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 37031 1727204379.81421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.81474: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b741e940> <<< 37031 1727204379.81551: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 37031 1727204379.81566: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74d3d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74ca7c0> <<< 37031 1727204379.81621: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 37031 1727204379.81659: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 37031 1727204379.81662: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available <<< 37031 1727204379.81782: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.81912: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 37031 1727204379.81955: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74d54c0> <<< 37031 1727204379.81959: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82338: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82714: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82767: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82838: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 37031 1727204379.82844: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82872: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82911: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 37031 1727204379.82914: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.82974: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83078: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 37031 1727204379.83091: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip 
/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 37031 1727204379.83120: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83165: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 37031 1727204379.83350: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83543: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 37031 1727204379.83578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 37031 1727204379.83657: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b6fb4940> <<< 37031 1727204379.83684: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83728: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83798: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 37031 1727204379.83823: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83859: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 37031 1727204379.83898: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 37031 1727204379.83947: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.83982: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84081: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84138: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 37031 1727204379.84192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 37031 1727204379.84237: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7560b50> <<< 37031 1727204379.84268: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b6fb3070> <<< 37031 1727204379.84386: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 37031 1727204379.84390: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84423: stdout chunk (state=3): >>># zipimport: zlib available <<< 
37031 1727204379.84483: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84512: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84576: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 37031 1727204379.84581: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 37031 1727204379.84593: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 37031 1727204379.84639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 37031 1727204379.84659: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 37031 1727204379.84662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 37031 1727204379.84738: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b70046d0> <<< 37031 1727204379.84777: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7414c10> <<< 37031 1727204379.84858: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74135b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 37031 1727204379.84863: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84898: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.84902: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 37031 1727204379.84997: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 37031 1727204379.85013: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 37031 1727204379.85016: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.85144: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.85300: stdout chunk (state=3): >>># zipimport: zlib available <<< 37031 1727204379.85432: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 37031 1727204379.85709: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path <<< 37031 1727204379.85749: stdout chunk (state=3): >>># clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] 
removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools <<< 37031 1727204379.85778: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json <<< 37031 1727204379.85782: stdout chunk (state=3): >>># cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing 
uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # 
destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 37031 1727204379.86000: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 37031 1727204379.86083: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 37031 
1727204379.86151: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 37031 1727204379.86218: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize <<< 37031 1727204379.86275: stdout chunk (state=3): >>># cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 37031 1727204379.86289: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy 
collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 37031 1727204379.86438: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq <<< 37031 1727204379.86488: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 37031 1727204379.86500: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 37031 1727204379.86919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204379.86922: stdout chunk (state=3): >>><<< 37031 1727204379.86927: stderr chunk (state=3): >>><<< 37031 1727204379.87042: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8098ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # 
code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7df2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dcf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b8055880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7dc8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7df2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b803d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d6d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7cf1eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf19a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf1fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cf1df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01160> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d49e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d41700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d55760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d75eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7d01d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d49340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7d55370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d7ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cd5460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cd5550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cb30d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d04b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d044c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c092b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7cc0d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d04fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d7b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c19be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7c19f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2c820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2cd60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bc5490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c19f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bd5370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7c2c6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bd5430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d01ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf1850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bf1d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7bfb2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf19d0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7be5b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7d016a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7bf1b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff6b75e6760> # zipimport: found 30 names in '/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b750d160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750de20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b750d580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b750d100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7464fd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7482c40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7482f40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74822e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7575d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b75753a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7575f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b75e6a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e0dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e0490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7517580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74e05b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e05e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7455f70> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b75552e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74527f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7555460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b756df40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7452790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74525e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7451550> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7451490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b754c9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d66a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d5bb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74e60d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b74d6100> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7519c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b741e940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74d3d30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74ca7c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74d54c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b6fb4940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff6b7560b50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b6fb3070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b70046d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b7414c10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff6b74135b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload__jk0u2bt/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
Shared connection to 10.31.13.78 closed.
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2]
removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # 
destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # 
destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
37031 1727204379.89404: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
37031 1727204379.89408: _low_level_execute_command(): starting
37031 1727204379.89411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204379.413507-37250-123011068968429/ > /dev/null 2>&1 && sleep 0'
37031 1727204379.89462: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
37031 1727204379.89472: stderr chunk (state=3):
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204379.89521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204379.89524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.89526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204379.89528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204379.89530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204379.89533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204379.89605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204379.89609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204379.89759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204379.89815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204379.91621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204379.91707: stderr chunk (state=3): >>><<< 37031 1727204379.91711: stdout chunk (state=3): >>><<< 37031 1727204379.91873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204379.91877: handler run complete 37031 1727204379.91879: attempt loop complete, returning result 37031 1727204379.91882: _execute() done 37031 1727204379.91884: dumping result to json 37031 1727204379.91886: done dumping result, returning 37031 1727204379.91888: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [0affcd87-79f5-b754-dfb8-0000000000cc] 37031 1727204379.91890: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cc 37031 1727204379.91957: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cc 37031 1727204379.91960: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 37031 1727204379.92129: no more pending results, returning what we have 37031 1727204379.92132: results queue empty 37031 1727204379.92133: checking for any_errors_fatal 37031 1727204379.92138: done checking for any_errors_fatal 37031 1727204379.92139: checking for 
max_fail_percentage 37031 1727204379.92141: done checking for max_fail_percentage 37031 1727204379.92142: checking to see if all hosts have failed and the running result is not ok 37031 1727204379.92143: done checking to see if all hosts have failed 37031 1727204379.92143: getting the remaining hosts for this loop 37031 1727204379.92145: done getting the remaining hosts for this loop 37031 1727204379.92149: getting the next task for host managed-node2 37031 1727204379.92156: done getting next task for host managed-node2 37031 1727204379.92158: ^ task is: TASK: Set flag to indicate system is ostree 37031 1727204379.92161: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204379.92168: getting variables 37031 1727204379.92170: in VariableManager get_vars() 37031 1727204379.92201: Calling all_inventory to load vars for managed-node2 37031 1727204379.92203: Calling groups_inventory to load vars for managed-node2 37031 1727204379.92207: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.92218: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.92220: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.92224: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.92544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.92739: done with get_vars() 37031 1727204379.92750: done getting variables 37031 1727204379.92845: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:59:39 -0400 (0:00:00.570) 0:00:02.475 ***** 37031 1727204379.93078: entering _queue_task() for managed-node2/set_fact 37031 1727204379.93080: Creating lock for set_fact 37031 1727204379.93905: worker is 1 (out of 1 available) 37031 1727204379.93917: exiting _queue_task() for managed-node2/set_fact 37031 1727204379.93929: done queuing things up, now waiting for results queue to drain 37031 1727204379.93930: waiting for pending results... 
37031 1727204379.94807: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 37031 1727204379.95032: in run() - task 0affcd87-79f5-b754-dfb8-0000000000cd 37031 1727204379.95051: variable 'ansible_search_path' from source: unknown 37031 1727204379.95059: variable 'ansible_search_path' from source: unknown 37031 1727204379.95101: calling self._execute() 37031 1727204379.95290: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.95301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.95314: variable 'omit' from source: magic vars 37031 1727204379.96526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204379.96881: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204379.96933: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204379.96981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204379.97021: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204379.97123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204379.97168: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204379.97201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204379.97234: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204379.97387: Evaluated conditional (not __network_is_ostree is defined): True 37031 1727204379.97399: variable 'omit' from source: magic vars 37031 1727204379.97444: variable 'omit' from source: magic vars 37031 1727204379.97591: variable '__ostree_booted_stat' from source: set_fact 37031 1727204379.97660: variable 'omit' from source: magic vars 37031 1727204379.97697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204379.97739: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204379.97770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204379.97792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204379.97809: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204379.97851: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204379.97867: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.97876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.97991: Set connection var ansible_connection to ssh 37031 1727204379.97999: Set connection var ansible_shell_type to sh 37031 1727204379.98011: Set connection var ansible_pipelining to False 37031 1727204379.98022: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204379.98042: Set connection var ansible_timeout to 10 37031 1727204379.98059: Set connection var ansible_shell_executable to /bin/sh 37031 1727204379.98093: variable 'ansible_shell_executable' 
from source: unknown 37031 1727204379.98102: variable 'ansible_connection' from source: unknown 37031 1727204379.98109: variable 'ansible_module_compression' from source: unknown 37031 1727204379.98116: variable 'ansible_shell_type' from source: unknown 37031 1727204379.98122: variable 'ansible_shell_executable' from source: unknown 37031 1727204379.98130: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204379.98140: variable 'ansible_pipelining' from source: unknown 37031 1727204379.98156: variable 'ansible_timeout' from source: unknown 37031 1727204379.98170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204379.98317: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204379.98334: variable 'omit' from source: magic vars 37031 1727204379.98343: starting attempt loop 37031 1727204379.98349: running the handler 37031 1727204379.98373: handler run complete 37031 1727204379.98391: attempt loop complete, returning result 37031 1727204379.98399: _execute() done 37031 1727204379.98407: dumping result to json 37031 1727204379.98415: done dumping result, returning 37031 1727204379.98425: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [0affcd87-79f5-b754-dfb8-0000000000cd] 37031 1727204379.98435: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cd ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 37031 1727204379.98590: no more pending results, returning what we have 37031 1727204379.98593: results queue empty 37031 1727204379.98594: checking for any_errors_fatal 37031 1727204379.98598: done checking for any_errors_fatal 37031 
1727204379.98599: checking for max_fail_percentage 37031 1727204379.98601: done checking for max_fail_percentage 37031 1727204379.98602: checking to see if all hosts have failed and the running result is not ok 37031 1727204379.98603: done checking to see if all hosts have failed 37031 1727204379.98603: getting the remaining hosts for this loop 37031 1727204379.98605: done getting the remaining hosts for this loop 37031 1727204379.98609: getting the next task for host managed-node2 37031 1727204379.98618: done getting next task for host managed-node2 37031 1727204379.98620: ^ task is: TASK: Fix CentOS6 Base repo 37031 1727204379.98622: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204379.98625: getting variables 37031 1727204379.98627: in VariableManager get_vars() 37031 1727204379.98656: Calling all_inventory to load vars for managed-node2 37031 1727204379.98659: Calling groups_inventory to load vars for managed-node2 37031 1727204379.98662: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204379.98671: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cd 37031 1727204379.98684: WORKER PROCESS EXITING 37031 1727204379.98695: Calling all_plugins_play to load vars for managed-node2 37031 1727204379.98699: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204379.98707: Calling groups_plugins_play to load vars for managed-node2 37031 1727204379.98909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204379.99112: done with get_vars() 37031 1727204379.99124: done getting variables 37031 1727204379.99251: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:59:39 -0400 (0:00:00.064) 0:00:02.539 ***** 37031 1727204379.99493: entering _queue_task() for managed-node2/copy 37031 1727204379.99993: worker is 1 (out of 1 available) 37031 1727204380.00005: exiting _queue_task() for managed-node2/copy 37031 1727204380.00015: done queuing things up, now waiting for results queue to drain 37031 1727204380.00017: waiting for pending results... 
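The `set_fact` result just above (`__network_is_ostree: false`), the evaluated conditional (`not __network_is_ostree is defined`), and the reference to `__ostree_booted_stat` (set earlier from the `stat` of `/run/ostree-booted`, where `exists` was `false`) suggest the task at el_repo_setup.yml:22 looks roughly like the sketch below. The exact Jinja2 expression is a reconstruction, not taken from the playbook source:

```yaml
# Hedged reconstruction of the task logged above
# (tests/network/tasks/el_repo_setup.yml:22). The fact name, the
# when-condition, and the use of __ostree_booted_stat are grounded in
# the log; the exact expression is an assumption.
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```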
37031 1727204380.00807: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 37031 1727204380.00929: in run() - task 0affcd87-79f5-b754-dfb8-0000000000cf 37031 1727204380.01093: variable 'ansible_search_path' from source: unknown 37031 1727204380.01102: variable 'ansible_search_path' from source: unknown 37031 1727204380.01144: calling self._execute() 37031 1727204380.01342: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.01357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.01374: variable 'omit' from source: magic vars 37031 1727204380.02409: variable 'ansible_distribution' from source: facts 37031 1727204380.02439: Evaluated conditional (ansible_distribution == 'CentOS'): True 37031 1727204380.02718: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.02733: Evaluated conditional (ansible_distribution_major_version == '6'): False 37031 1727204380.02741: when evaluation is False, skipping this task 37031 1727204380.02771: _execute() done 37031 1727204380.02815: dumping result to json 37031 1727204380.02839: done dumping result, returning 37031 1727204380.02850: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [0affcd87-79f5-b754-dfb8-0000000000cf] 37031 1727204380.02917: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cf skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 37031 1727204380.03098: no more pending results, returning what we have 37031 1727204380.03102: results queue empty 37031 1727204380.03103: checking for any_errors_fatal 37031 1727204380.03107: done checking for any_errors_fatal 37031 1727204380.03108: checking for max_fail_percentage 37031 1727204380.03110: done checking for max_fail_percentage 37031 1727204380.03111: checking to see if all hosts have failed and the 
running result is not ok 37031 1727204380.03112: done checking to see if all hosts have failed 37031 1727204380.03113: getting the remaining hosts for this loop 37031 1727204380.03115: done getting the remaining hosts for this loop 37031 1727204380.03120: getting the next task for host managed-node2 37031 1727204380.03128: done getting next task for host managed-node2 37031 1727204380.03131: ^ task is: TASK: Include the task 'enable_epel.yml' 37031 1727204380.03134: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.03138: getting variables 37031 1727204380.03140: in VariableManager get_vars() 37031 1727204380.03174: Calling all_inventory to load vars for managed-node2 37031 1727204380.03178: Calling groups_inventory to load vars for managed-node2 37031 1727204380.03182: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.03197: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.03200: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.03204: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.03385: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000cf 37031 1727204380.03389: WORKER PROCESS EXITING 37031 1727204380.03404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.03580: done with get_vars() 37031 1727204380.03591: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.041) 0:00:02.581 ***** 37031 1727204380.03684: entering _queue_task() for managed-node2/include_tasks 37031 1727204380.04158: worker is 1 (out of 1 available) 37031 1727204380.04371: exiting _queue_task() for managed-node2/include_tasks 37031 1727204380.04382: done queuing things up, now waiting for results queue to drain 37031 1727204380.04384: waiting for pending results... 
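The `skipping` result above records `false_condition: ansible_distribution_major_version == '6'` after `ansible_distribution == 'CentOS'` evaluated true, and the action plugin loaded for the task was `copy`. A sketch of the guarded task at el_repo_setup.yml:26; only the name, module, and conditions are grounded in the log, and the `copy` parameters never appear in this output, so they are left elided:

```yaml
# Reconstruction from the skip result above. The copy parameters
# (src/dest/content) are not visible in this log and are omitted.
- name: Fix CentOS6 Base repo
  copy: {}  # parameters not shown in this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```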
37031 1727204380.05094: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 37031 1727204380.05391: in run() - task 0affcd87-79f5-b754-dfb8-0000000000d0 37031 1727204380.05494: variable 'ansible_search_path' from source: unknown 37031 1727204380.05502: variable 'ansible_search_path' from source: unknown 37031 1727204380.05547: calling self._execute() 37031 1727204380.05762: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.05779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.05818: variable 'omit' from source: magic vars 37031 1727204380.06892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204380.09537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204380.09620: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204380.09674: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204380.09734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204380.09771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204380.09869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204380.09904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204380.09940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204380.09991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204380.10010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204380.10140: variable '__network_is_ostree' from source: set_fact 37031 1727204380.10177: Evaluated conditional (not __network_is_ostree | d(false)): True 37031 1727204380.10187: _execute() done 37031 1727204380.10193: dumping result to json 37031 1727204380.10199: done dumping result, returning 37031 1727204380.10207: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-b754-dfb8-0000000000d0] 37031 1727204380.10213: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000d0 37031 1727204380.10333: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000d0 37031 1727204380.10340: WORKER PROCESS EXITING 37031 1727204380.10368: no more pending results, returning what we have 37031 1727204380.10373: in VariableManager get_vars() 37031 1727204380.10407: Calling all_inventory to load vars for managed-node2 37031 1727204380.10410: Calling groups_inventory to load vars for managed-node2 37031 1727204380.10413: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.10422: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.10425: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.10427: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.10632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 37031 1727204380.10797: done with get_vars() 37031 1727204380.10806: variable 'ansible_search_path' from source: unknown 37031 1727204380.10807: variable 'ansible_search_path' from source: unknown 37031 1727204380.10844: we have included files to process 37031 1727204380.10845: generating all_blocks data 37031 1727204380.10847: done generating all_blocks data 37031 1727204380.10852: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 37031 1727204380.10853: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 37031 1727204380.10856: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 37031 1727204380.12269: done processing included file 37031 1727204380.12272: iterating over new_blocks loaded from include file 37031 1727204380.12274: in VariableManager get_vars() 37031 1727204380.12289: done with get_vars() 37031 1727204380.12291: filtering new block on tags 37031 1727204380.12315: done filtering new block on tags 37031 1727204380.12318: in VariableManager get_vars() 37031 1727204380.12330: done with get_vars() 37031 1727204380.12332: filtering new block on tags 37031 1727204380.12343: done filtering new block on tags 37031 1727204380.12345: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 37031 1727204380.12351: extending task lists for all hosts with included blocks 37031 1727204380.12463: done extending task lists 37031 1727204380.13167: done processing included files 37031 1727204380.13169: results queue empty 37031 1727204380.13169: checking for any_errors_fatal 37031 1727204380.13173: done checking for any_errors_fatal 37031 1727204380.13174: checking for max_fail_percentage 37031 1727204380.13176: done 
checking for max_fail_percentage 37031 1727204380.13176: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.13177: done checking to see if all hosts have failed 37031 1727204380.13178: getting the remaining hosts for this loop 37031 1727204380.13180: done getting the remaining hosts for this loop 37031 1727204380.13183: getting the next task for host managed-node2 37031 1727204380.13187: done getting next task for host managed-node2 37031 1727204380.13190: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 37031 1727204380.13193: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.13196: getting variables 37031 1727204380.13197: in VariableManager get_vars() 37031 1727204380.13208: Calling all_inventory to load vars for managed-node2 37031 1727204380.13210: Calling groups_inventory to load vars for managed-node2 37031 1727204380.13212: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.13219: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.13228: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.13233: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.13401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.13581: done with get_vars() 37031 1727204380.13590: done getting variables 37031 1727204380.13661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 37031 1727204380.14569: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.109) 0:00:02.690 ***** 37031 1727204380.14612: entering _queue_task() for managed-node2/command 37031 1727204380.14613: Creating lock for command 37031 1727204380.14917: worker is 1 (out of 1 available) 37031 1727204380.14928: exiting _queue_task() for managed-node2/command 37031 1727204380.14938: done queuing things up, now waiting for results queue to drain 37031 1727204380.14939: waiting for pending results... 
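The include above evaluated `not __network_is_ostree | d(false)` to True and then loaded tests/network/tasks/enable_epel.yml, so the task at el_repo_setup.yml:51 is presumably shaped as follows (the relative path is an assumption; both files sit in the same tasks/ directory per the log):

```yaml
# Reconstruction of the include logged above
# (tests/network/tasks/el_repo_setup.yml:51). Task name, included file,
# and when-condition come from the log; the relative path is assumed.
- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)
```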
37031 1727204380.15514: running TaskExecutor() for managed-node2/TASK: Create EPEL 9 37031 1727204380.15725: in run() - task 0affcd87-79f5-b754-dfb8-0000000000ea 37031 1727204380.15736: variable 'ansible_search_path' from source: unknown 37031 1727204380.15740: variable 'ansible_search_path' from source: unknown 37031 1727204380.15777: calling self._execute() 37031 1727204380.15961: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.15966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.15976: variable 'omit' from source: magic vars 37031 1727204380.16814: variable 'ansible_distribution' from source: facts 37031 1727204380.16824: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 37031 1727204380.16972: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.16978: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 37031 1727204380.16982: when evaluation is False, skipping this task 37031 1727204380.16984: _execute() done 37031 1727204380.16989: dumping result to json 37031 1727204380.16991: done dumping result, returning 37031 1727204380.16999: done running TaskExecutor() for managed-node2/TASK: Create EPEL 9 [0affcd87-79f5-b754-dfb8-0000000000ea] 37031 1727204380.17004: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ea 37031 1727204380.17113: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ea 37031 1727204380.17116: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 37031 1727204380.17170: no more pending results, returning what we have 37031 1727204380.17175: results queue empty 37031 1727204380.17176: checking for any_errors_fatal 37031 1727204380.17177: done checking for any_errors_fatal 37031 1727204380.17178: checking for 
max_fail_percentage 37031 1727204380.17179: done checking for max_fail_percentage 37031 1727204380.17180: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.17181: done checking to see if all hosts have failed 37031 1727204380.17182: getting the remaining hosts for this loop 37031 1727204380.17183: done getting the remaining hosts for this loop 37031 1727204380.17187: getting the next task for host managed-node2 37031 1727204380.17194: done getting next task for host managed-node2 37031 1727204380.17197: ^ task is: TASK: Install yum-utils package 37031 1727204380.17202: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.17205: getting variables 37031 1727204380.17207: in VariableManager get_vars() 37031 1727204380.17238: Calling all_inventory to load vars for managed-node2 37031 1727204380.17241: Calling groups_inventory to load vars for managed-node2 37031 1727204380.17244: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.17258: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.17260: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.17265: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.17431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.17852: done with get_vars() 37031 1727204380.17861: done getting variables 37031 1727204380.17998: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.034) 0:00:02.725 ***** 37031 1727204380.18032: entering _queue_task() for managed-node2/package 37031 1727204380.18034: Creating lock for package 37031 1727204380.18468: worker is 1 (out of 1 available) 37031 1727204380.18480: exiting _queue_task() for managed-node2/package 37031 1727204380.18492: done queuing things up, now waiting for results queue to drain 37031 1727204380.18493: waiting for pending results... 
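Both skips above follow the same mechanics: each `when` conditional is evaluated in order against the gathered facts, and the first one that comes back False short-circuits the task and is recorded as `false_condition`. Note that `ansible_distribution_major_version` is a string fact, so the membership test compares strings, not integers. A hedged Python mimic of that evaluation (the facts dict is assumed for an EL9-like node; Ansible actually evaluates these as Jinja2 expressions, not with `eval()`):

```python
# Illustrative facts, as might be gathered on an EL9-like node (values assumed).
facts = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "9",  # a string, not an int
}

# Conditions are checked in order; the first False one is reported back
# as "false_condition" and the task is skipped.
conditions = [
    "ansible_distribution in ['RedHat', 'CentOS']",
    "ansible_distribution_major_version in ['7', '8']",
]

false_condition = None
for cond in conditions:
    if not eval(cond, {}, facts):  # sketch only; Ansible uses Jinja2 templating
        false_condition = cond
        break

print(false_condition)
```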
37031 1727204380.18726: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 37031 1727204380.18823: in run() - task 0affcd87-79f5-b754-dfb8-0000000000eb 37031 1727204380.18836: variable 'ansible_search_path' from source: unknown 37031 1727204380.18839: variable 'ansible_search_path' from source: unknown 37031 1727204380.18880: calling self._execute() 37031 1727204380.18940: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.18946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.18978: variable 'omit' from source: magic vars 37031 1727204380.19345: variable 'ansible_distribution' from source: facts 37031 1727204380.19358: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 37031 1727204380.19479: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.19483: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 37031 1727204380.19486: when evaluation is False, skipping this task 37031 1727204380.19491: _execute() done 37031 1727204380.19494: dumping result to json 37031 1727204380.19503: done dumping result, returning 37031 1727204380.19510: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [0affcd87-79f5-b754-dfb8-0000000000eb] 37031 1727204380.19514: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000eb skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 37031 1727204380.19653: no more pending results, returning what we have 37031 1727204380.19657: results queue empty 37031 1727204380.19658: checking for any_errors_fatal 37031 1727204380.19668: done checking for any_errors_fatal 37031 1727204380.19669: checking for max_fail_percentage 37031 1727204380.19670: done checking for max_fail_percentage 37031 1727204380.19671: checking to see if 
all hosts have failed and the running result is not ok 37031 1727204380.19672: done checking to see if all hosts have failed 37031 1727204380.19673: getting the remaining hosts for this loop 37031 1727204380.19675: done getting the remaining hosts for this loop 37031 1727204380.19679: getting the next task for host managed-node2 37031 1727204380.19687: done getting next task for host managed-node2 37031 1727204380.19692: ^ task is: TASK: Enable EPEL 7 37031 1727204380.19697: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.19700: getting variables 37031 1727204380.19702: in VariableManager get_vars() 37031 1727204380.19731: Calling all_inventory to load vars for managed-node2 37031 1727204380.19734: Calling groups_inventory to load vars for managed-node2 37031 1727204380.19737: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.19752: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.19756: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.19759: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.19924: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000eb 37031 1727204380.19930: WORKER PROCESS EXITING 37031 1727204380.19946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.20135: done with get_vars() 37031 1727204380.20145: done getting variables 37031 1727204380.20224: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.022) 0:00:02.747 ***** 37031 1727204380.20261: entering _queue_task() for managed-node2/command 37031 1727204380.20739: worker is 1 (out of 1 available) 37031 1727204380.20752: exiting _queue_task() for managed-node2/command 37031 1727204380.20766: done queuing things up, now waiting for results queue to drain 37031 1727204380.20767: waiting for pending results... 
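The `command` action plugin is loaded with `found_in_cache=False` for the first task ("Create EPEL 9") and `found_in_cache=True` for every later one, so the action search path is only walked once per plugin name. A toy memoizing loader showing the idea (not Ansible's actual `PluginLoader`; names are illustrative):

```python
_cache: dict = {}

def load_plugin(name: str, search_paths: list) -> tuple:
    """Return (plugin, found_in_cache), caching after the first lookup."""
    if name in _cache:
        return _cache[name], True
    # Stand-in for scanning search_paths and importing the module file.
    plugin = f"<ActionModule {name}>"
    _cache[name] = plugin
    return plugin, False

paths = ["/usr/local/lib/python3.12/site-packages/ansible/plugins/action"]
_, hit1 = load_plugin("command", paths)   # first task: found_in_cache=False
_, hit2 = load_plugin("command", paths)   # later tasks: found_in_cache=True
print(hit1, hit2)
```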
37031 1727204380.21008: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 37031 1727204380.21095: in run() - task 0affcd87-79f5-b754-dfb8-0000000000ec 37031 1727204380.21112: variable 'ansible_search_path' from source: unknown 37031 1727204380.21115: variable 'ansible_search_path' from source: unknown 37031 1727204380.21148: calling self._execute() 37031 1727204380.21222: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.21226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.21237: variable 'omit' from source: magic vars 37031 1727204380.21622: variable 'ansible_distribution' from source: facts 37031 1727204380.21635: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 37031 1727204380.21771: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.21777: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 37031 1727204380.21780: when evaluation is False, skipping this task 37031 1727204380.21784: _execute() done 37031 1727204380.21786: dumping result to json 37031 1727204380.21790: done dumping result, returning 37031 1727204380.21796: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [0affcd87-79f5-b754-dfb8-0000000000ec] 37031 1727204380.21800: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ec skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 37031 1727204380.21935: no more pending results, returning what we have 37031 1727204380.21939: results queue empty 37031 1727204380.21940: checking for any_errors_fatal 37031 1727204380.21945: done checking for any_errors_fatal 37031 1727204380.21945: checking for max_fail_percentage 37031 1727204380.21947: done checking for max_fail_percentage 37031 1727204380.21948: checking to see if all hosts have failed and 
the running result is not ok 37031 1727204380.21949: done checking to see if all hosts have failed 37031 1727204380.21950: getting the remaining hosts for this loop 37031 1727204380.21952: done getting the remaining hosts for this loop 37031 1727204380.21955: getting the next task for host managed-node2 37031 1727204380.21963: done getting next task for host managed-node2 37031 1727204380.21967: ^ task is: TASK: Enable EPEL 8 37031 1727204380.21972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.21975: getting variables 37031 1727204380.21978: in VariableManager get_vars() 37031 1727204380.22006: Calling all_inventory to load vars for managed-node2 37031 1727204380.22009: Calling groups_inventory to load vars for managed-node2 37031 1727204380.22012: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.22018: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ec 37031 1727204380.22032: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.22035: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.22039: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.22263: WORKER PROCESS EXITING 37031 1727204380.22296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.22498: done with get_vars() 37031 1727204380.22507: done getting variables 37031 1727204380.22691: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.024) 0:00:02.772 ***** 37031 1727204380.22725: entering _queue_task() for managed-node2/command 37031 1727204380.23066: worker is 1 (out of 1 available) 37031 1727204380.23081: exiting _queue_task() for managed-node2/command 37031 1727204380.23093: done queuing things up, now waiting for results queue to drain 37031 1727204380.23095: waiting for pending results... 
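Each `skipping: [managed-node2] => {...}` block above is a JSON result payload: `changed` is false, `skip_reason` is fixed text, and `false_condition` carries the exact conditional that failed. Parsing one payload as it appears in the log:

```python
import json

# One skip-result payload, copied from the log above.
payload = """{
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}"""

result = json.loads(payload)
print(result["false_condition"])
```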
37031 1727204380.23671: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 37031 1727204380.23677: in run() - task 0affcd87-79f5-b754-dfb8-0000000000ed 37031 1727204380.23680: variable 'ansible_search_path' from source: unknown 37031 1727204380.23682: variable 'ansible_search_path' from source: unknown 37031 1727204380.23685: calling self._execute() 37031 1727204380.23687: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.23696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.23698: variable 'omit' from source: magic vars 37031 1727204380.23955: variable 'ansible_distribution' from source: facts 37031 1727204380.23971: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 37031 1727204380.24170: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.24173: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 37031 1727204380.24176: when evaluation is False, skipping this task 37031 1727204380.24179: _execute() done 37031 1727204380.24181: dumping result to json 37031 1727204380.24183: done dumping result, returning 37031 1727204380.24185: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [0affcd87-79f5-b754-dfb8-0000000000ed] 37031 1727204380.24187: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ed 37031 1727204380.24249: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ed 37031 1727204380.24252: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 37031 1727204380.24305: no more pending results, returning what we have 37031 1727204380.24309: results queue empty 37031 1727204380.24310: checking for any_errors_fatal 37031 1727204380.24315: done checking for any_errors_fatal 37031 1727204380.24316: checking for 
max_fail_percentage 37031 1727204380.24317: done checking for max_fail_percentage 37031 1727204380.24318: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.24319: done checking to see if all hosts have failed 37031 1727204380.24320: getting the remaining hosts for this loop 37031 1727204380.24322: done getting the remaining hosts for this loop 37031 1727204380.24326: getting the next task for host managed-node2 37031 1727204380.24335: done getting next task for host managed-node2 37031 1727204380.24338: ^ task is: TASK: Enable EPEL 6 37031 1727204380.24343: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.24346: getting variables 37031 1727204380.24348: in VariableManager get_vars() 37031 1727204380.24380: Calling all_inventory to load vars for managed-node2 37031 1727204380.24384: Calling groups_inventory to load vars for managed-node2 37031 1727204380.24387: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.24401: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.24404: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.24407: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.24593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.24901: done with get_vars() 37031 1727204380.24912: done getting variables 37031 1727204380.24988: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.022) 0:00:02.795 ***** 37031 1727204380.25021: entering _queue_task() for managed-node2/copy 37031 1727204380.25389: worker is 1 (out of 1 available) 37031 1727204380.25402: exiting _queue_task() for managed-node2/copy 37031 1727204380.25414: done queuing things up, now waiting for results queue to drain 37031 1727204380.25415: waiting for pending results... 
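Each task banner ends with two timer stamps, e.g. `(0:00:00.022) 0:00:02.795`: the first is that task's duration and the second the cumulative run time, both in `H:MM:SS.mmm` form (this layout appears to come from a `profile_tasks`-style timer callback). Converting one such stamp back to seconds:

```python
def to_seconds(stamp: str) -> float:
    """Convert an 'H:MM:SS.mmm' timer stamp to seconds."""
    hours, minutes, seconds = stamp.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

task_time = to_seconds("0:00:00.022")   # this task's duration
total_time = to_seconds("0:00:02.795")  # cumulative run time
print(task_time, total_time)
```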
37031 1727204380.25659: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 37031 1727204380.25744: in run() - task 0affcd87-79f5-b754-dfb8-0000000000ef 37031 1727204380.25760: variable 'ansible_search_path' from source: unknown 37031 1727204380.25769: variable 'ansible_search_path' from source: unknown 37031 1727204380.25798: calling self._execute() 37031 1727204380.25868: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.25875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.25884: variable 'omit' from source: magic vars 37031 1727204380.26245: variable 'ansible_distribution' from source: facts 37031 1727204380.26258: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 37031 1727204380.26374: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.26380: Evaluated conditional (ansible_distribution_major_version == '6'): False 37031 1727204380.26383: when evaluation is False, skipping this task 37031 1727204380.26385: _execute() done 37031 1727204380.26388: dumping result to json 37031 1727204380.26393: done dumping result, returning 37031 1727204380.26398: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [0affcd87-79f5-b754-dfb8-0000000000ef] 37031 1727204380.26407: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ef 37031 1727204380.26509: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000ef 37031 1727204380.26513: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 37031 1727204380.26557: no more pending results, returning what we have 37031 1727204380.26562: results queue empty 37031 1727204380.26563: checking for any_errors_fatal 37031 1727204380.26569: done checking for any_errors_fatal 37031 1727204380.26570: checking for max_fail_percentage 
37031 1727204380.26571: done checking for max_fail_percentage 37031 1727204380.26572: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.26574: done checking to see if all hosts have failed 37031 1727204380.26574: getting the remaining hosts for this loop 37031 1727204380.26576: done getting the remaining hosts for this loop 37031 1727204380.26580: getting the next task for host managed-node2 37031 1727204380.26588: done getting next task for host managed-node2 37031 1727204380.26592: ^ task is: TASK: Set network provider to 'nm' 37031 1727204380.26594: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204380.26598: getting variables 37031 1727204380.26600: in VariableManager get_vars() 37031 1727204380.26629: Calling all_inventory to load vars for managed-node2 37031 1727204380.26631: Calling groups_inventory to load vars for managed-node2 37031 1727204380.26635: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.26647: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.26650: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.26653: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.26881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.27068: done with get_vars() 37031 1727204380.27080: done getting variables 37031 1727204380.27152: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.022) 0:00:02.817 ***** 37031 1727204380.27286: entering _queue_task() for managed-node2/set_fact 37031 1727204380.27490: worker is 1 (out of 1 available) 37031 1727204380.27501: exiting _queue_task() for managed-node2/set_fact 37031 1727204380.27512: done queuing things up, now waiting for results queue to drain 37031 1727204380.27514: waiting for pending results... 37031 1727204380.27746: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 37031 1727204380.27820: in run() - task 0affcd87-79f5-b754-dfb8-000000000007 37031 1727204380.27831: variable 'ansible_search_path' from source: unknown 37031 1727204380.27868: calling self._execute() 37031 1727204380.27932: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.27935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.27944: variable 'omit' from source: magic vars 37031 1727204380.28035: variable 'omit' from source: magic vars 37031 1727204380.28062: variable 'omit' from source: magic vars 37031 1727204380.28101: variable 'omit' from source: magic vars 37031 1727204380.28141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204380.28178: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204380.28204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204380.28222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204380.28232: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204380.28262: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204380.28268: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.28271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.28370: Set connection var ansible_connection to ssh 37031 1727204380.28373: Set connection var ansible_shell_type to sh 37031 1727204380.28380: Set connection var ansible_pipelining to False 37031 1727204380.28388: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204380.28394: Set connection var ansible_timeout to 10 37031 1727204380.28399: Set connection var ansible_shell_executable to /bin/sh 37031 1727204380.28432: variable 'ansible_shell_executable' from source: unknown 37031 1727204380.28435: variable 'ansible_connection' from source: unknown 37031 1727204380.28437: variable 'ansible_module_compression' from source: unknown 37031 1727204380.28442: variable 'ansible_shell_type' from source: unknown 37031 1727204380.28444: variable 'ansible_shell_executable' from source: unknown 37031 1727204380.28446: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.28448: variable 'ansible_pipelining' from source: unknown 37031 1727204380.28450: variable 'ansible_timeout' from source: unknown 37031 1727204380.28456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.28600: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204380.28610: variable 'omit' from source: magic vars 37031 1727204380.28615: starting 
attempt loop 37031 1727204380.28618: running the handler 37031 1727204380.28652: handler run complete 37031 1727204380.28659: attempt loop complete, returning result 37031 1727204380.28661: _execute() done 37031 1727204380.28665: dumping result to json 37031 1727204380.28667: done dumping result, returning 37031 1727204380.28670: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [0affcd87-79f5-b754-dfb8-000000000007] 37031 1727204380.28672: sending task result for task 0affcd87-79f5-b754-dfb8-000000000007 37031 1727204380.28743: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000007 37031 1727204380.28747: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 37031 1727204380.28809: no more pending results, returning what we have 37031 1727204380.28813: results queue empty 37031 1727204380.28814: checking for any_errors_fatal 37031 1727204380.28820: done checking for any_errors_fatal 37031 1727204380.28821: checking for max_fail_percentage 37031 1727204380.28823: done checking for max_fail_percentage 37031 1727204380.28824: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.28825: done checking to see if all hosts have failed 37031 1727204380.28825: getting the remaining hosts for this loop 37031 1727204380.28827: done getting the remaining hosts for this loop 37031 1727204380.28830: getting the next task for host managed-node2 37031 1727204380.28837: done getting next task for host managed-node2 37031 1727204380.28839: ^ task is: TASK: meta (flush_handlers) 37031 1727204380.28841: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.28846: getting variables 37031 1727204380.28848: in VariableManager get_vars() 37031 1727204380.28877: Calling all_inventory to load vars for managed-node2 37031 1727204380.28880: Calling groups_inventory to load vars for managed-node2 37031 1727204380.28883: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.28893: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.28896: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.28898: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.29065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.29251: done with get_vars() 37031 1727204380.29260: done getting variables 37031 1727204380.29328: in VariableManager get_vars() 37031 1727204380.29337: Calling all_inventory to load vars for managed-node2 37031 1727204380.29339: Calling groups_inventory to load vars for managed-node2 37031 1727204380.29341: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.29346: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.29348: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.29351: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.29616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.29823: done with get_vars() 37031 1727204380.29836: done queuing things up, now waiting for results queue to drain 37031 1727204380.29838: results queue empty 37031 1727204380.29839: checking for any_errors_fatal 37031 1727204380.29841: done checking for any_errors_fatal 37031 1727204380.29841: checking for max_fail_percentage 37031 1727204380.29842: done checking for max_fail_percentage 37031 1727204380.29843: checking to see if all hosts have failed and the running result is not 
ok 37031 1727204380.29844: done checking to see if all hosts have failed 37031 1727204380.29845: getting the remaining hosts for this loop 37031 1727204380.29845: done getting the remaining hosts for this loop 37031 1727204380.29848: getting the next task for host managed-node2 37031 1727204380.29851: done getting next task for host managed-node2 37031 1727204380.29853: ^ task is: TASK: meta (flush_handlers) 37031 1727204380.29854: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204380.29861: getting variables 37031 1727204380.29862: in VariableManager get_vars() 37031 1727204380.29871: Calling all_inventory to load vars for managed-node2 37031 1727204380.29873: Calling groups_inventory to load vars for managed-node2 37031 1727204380.29876: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.29880: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.29882: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.29885: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.30013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.30749: done with get_vars() 37031 1727204380.30756: done getting variables 37031 1727204380.30804: in VariableManager get_vars() 37031 1727204380.30812: Calling all_inventory to load vars for managed-node2 37031 1727204380.30814: Calling groups_inventory to load vars for managed-node2 37031 1727204380.30816: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.30821: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.30823: Calling groups_plugins_inventory to load vars for 
managed-node2 37031 1727204380.30826: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.30962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.31141: done with get_vars() 37031 1727204380.31152: done queuing things up, now waiting for results queue to drain 37031 1727204380.31154: results queue empty 37031 1727204380.31155: checking for any_errors_fatal 37031 1727204380.31156: done checking for any_errors_fatal 37031 1727204380.31157: checking for max_fail_percentage 37031 1727204380.31158: done checking for max_fail_percentage 37031 1727204380.31158: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.31159: done checking to see if all hosts have failed 37031 1727204380.31160: getting the remaining hosts for this loop 37031 1727204380.31161: done getting the remaining hosts for this loop 37031 1727204380.31165: getting the next task for host managed-node2 37031 1727204380.31168: done getting next task for host managed-node2 37031 1727204380.31168: ^ task is: None 37031 1727204380.31170: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.31171: done queuing things up, now waiting for results queue to drain 37031 1727204380.31172: results queue empty 37031 1727204380.31173: checking for any_errors_fatal 37031 1727204380.31173: done checking for any_errors_fatal 37031 1727204380.31174: checking for max_fail_percentage 37031 1727204380.31175: done checking for max_fail_percentage 37031 1727204380.31175: checking to see if all hosts have failed and the running result is not ok 37031 1727204380.31176: done checking to see if all hosts have failed 37031 1727204380.31178: getting the next task for host managed-node2 37031 1727204380.31180: done getting next task for host managed-node2 37031 1727204380.31181: ^ task is: None 37031 1727204380.31182: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.31438: in VariableManager get_vars() 37031 1727204380.31494: done with get_vars() 37031 1727204380.31501: in VariableManager get_vars() 37031 1727204380.31516: done with get_vars() 37031 1727204380.31520: variable 'omit' from source: magic vars 37031 1727204380.31552: in VariableManager get_vars() 37031 1727204380.31572: done with get_vars() 37031 1727204380.31603: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 37031 1727204380.31998: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 37031 1727204380.32022: getting the remaining hosts for this loop 37031 1727204380.32024: done getting the remaining hosts for this loop 37031 1727204380.32026: getting the next task for host managed-node2 37031 1727204380.32028: done getting next task for host managed-node2 37031 1727204380.32030: ^ task is: TASK: Gathering Facts 37031 1727204380.32033: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204380.32034: getting variables 37031 1727204380.32035: in VariableManager get_vars() 37031 1727204380.32048: Calling all_inventory to load vars for managed-node2 37031 1727204380.32050: Calling groups_inventory to load vars for managed-node2 37031 1727204380.32052: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204380.32056: Calling all_plugins_play to load vars for managed-node2 37031 1727204380.32071: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204380.32075: Calling groups_plugins_play to load vars for managed-node2 37031 1727204380.32204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204380.32381: done with get_vars() 37031 1727204380.32388: done getting variables 37031 1727204380.32424: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Tuesday 24 September 2024 14:59:40 -0400 (0:00:00.051) 0:00:02.869 ***** 37031 1727204380.32447: entering _queue_task() for managed-node2/gather_facts 37031 1727204380.32688: worker is 1 (out of 1 available) 37031 1727204380.32699: exiting _queue_task() for managed-node2/gather_facts 37031 1727204380.32712: done queuing things up, now waiting for results queue to drain 37031 1727204380.32713: waiting for pending results... 
37031 1727204380.32951: running TaskExecutor() for managed-node2/TASK: Gathering Facts 37031 1727204380.33048: in run() - task 0affcd87-79f5-b754-dfb8-000000000115 37031 1727204380.33069: variable 'ansible_search_path' from source: unknown 37031 1727204380.33108: calling self._execute() 37031 1727204380.33187: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.33197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.33208: variable 'omit' from source: magic vars 37031 1727204380.33547: variable 'ansible_distribution_major_version' from source: facts 37031 1727204380.33565: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204380.33576: variable 'omit' from source: magic vars 37031 1727204380.33608: variable 'omit' from source: magic vars 37031 1727204380.33644: variable 'omit' from source: magic vars 37031 1727204380.33687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204380.33727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204380.33749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204380.33773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204380.33789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204380.33823: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204380.33832: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.33838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.33935: Set connection var ansible_connection to ssh 37031 1727204380.33943: Set 
connection var ansible_shell_type to sh 37031 1727204380.33953: Set connection var ansible_pipelining to False 37031 1727204380.33968: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204380.33977: Set connection var ansible_timeout to 10 37031 1727204380.33986: Set connection var ansible_shell_executable to /bin/sh 37031 1727204380.34015: variable 'ansible_shell_executable' from source: unknown 37031 1727204380.34025: variable 'ansible_connection' from source: unknown 37031 1727204380.34032: variable 'ansible_module_compression' from source: unknown 37031 1727204380.34038: variable 'ansible_shell_type' from source: unknown 37031 1727204380.34044: variable 'ansible_shell_executable' from source: unknown 37031 1727204380.34050: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204380.34056: variable 'ansible_pipelining' from source: unknown 37031 1727204380.34063: variable 'ansible_timeout' from source: unknown 37031 1727204380.34072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204380.34245: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204380.34261: variable 'omit' from source: magic vars 37031 1727204380.34272: starting attempt loop 37031 1727204380.34278: running the handler 37031 1727204380.34295: variable 'ansible_facts' from source: unknown 37031 1727204380.34317: _low_level_execute_command(): starting 37031 1727204380.34328: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204380.35078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204380.35095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 37031 1727204380.35113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.35130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.35170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.35183: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204380.35195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.35214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204380.35225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204380.35235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204380.35246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.35257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.35275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.35285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.35294: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204380.35305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.35387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204380.35413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204380.35433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204380.35514: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 
1727204380.37779: stdout chunk (state=3): >>>/root <<< 37031 1727204380.37929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204380.38025: stderr chunk (state=3): >>><<< 37031 1727204380.38038: stdout chunk (state=3): >>><<< 37031 1727204380.38175: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204380.38178: _low_level_execute_command(): starting 37031 1727204380.38182: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506 `" && echo ansible-tmp-1727204380.3807957-37297-275643783748506="` echo /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506 `" ) && sleep 0' 37031 1727204380.38794: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 37031 1727204380.38807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.38828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.38847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.38891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.38902: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204380.38915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.38938: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204380.38950: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204380.38960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204380.38974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.38986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.39000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.39010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.39020: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204380.39031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.39117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204380.39138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204380.39160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 37031 1727204380.39239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204380.41891: stdout chunk (state=3): >>>ansible-tmp-1727204380.3807957-37297-275643783748506=/root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506 <<< 37031 1727204380.42081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204380.42109: stderr chunk (state=3): >>><<< 37031 1727204380.42112: stdout chunk (state=3): >>><<< 37031 1727204380.42131: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204380.3807957-37297-275643783748506=/root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204380.42160: variable 'ansible_module_compression' from source: unknown 37031 1727204380.42201: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 37031 1727204380.42251: variable 'ansible_facts' from source: unknown 37031 1727204380.42372: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/AnsiballZ_setup.py 37031 1727204380.42490: Sending initial data 37031 1727204380.42494: Sent initial data (154 bytes) 37031 1727204380.43195: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.43199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.43244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.43248: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.43250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.43252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.43257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.43309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204380.43312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204380.43318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 
1727204380.43370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204380.45789: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204380.45824: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204380.45866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmptufeggxb /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/AnsiballZ_setup.py <<< 37031 1727204380.45899: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204380.48639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204380.48879: stderr chunk (state=3): >>><<< 37031 1727204380.48882: stdout chunk (state=3): >>><<< 37031 1727204380.48885: done transferring module to remote 37031 1727204380.48887: _low_level_execute_command(): starting 37031 1727204380.48889: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/ /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/AnsiballZ_setup.py && sleep 0' 37031 1727204380.49567: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204380.49582: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 37031 1727204380.49601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.49618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.49676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.49688: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204380.49702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.49719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204380.49730: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204380.49751: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204380.49770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.49783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.49798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.49809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.49819: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204380.49831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.49921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204380.49941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204380.49965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204380.50045: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 37031 1727204380.52606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204380.52621: stderr chunk (state=3): >>><<< 37031 1727204380.52625: stdout chunk (state=3): >>><<< 37031 1727204380.52642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204380.52739: _low_level_execute_command(): starting 37031 1727204380.52743: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/AnsiballZ_setup.py && sleep 0' 37031 1727204380.53334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204380.53347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.53361: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.53381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.53432: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.53443: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204380.53456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.53475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204380.53486: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204380.53501: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204380.53518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204380.53530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204380.53544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204380.53559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204380.53574: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204380.53587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204380.53676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204380.53696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204380.53713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204380.53801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.23099: stdout chunk (state=3): >>> {"ansible_facts": 
{"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2757, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 775, "free": 2757}, "nocache": {"free": 3233, "used": 299}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansib<<< 37031 1727204381.23107: stdout chunk (state=3): >>>le_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 743, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264270917632, "block_size": 4096, "block_total": 65519355, "block_available": 64519267, "block_used": 1000088, "inode_total": 131071472, "inode_available": 130998222, "inode_used": 73250, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8<<< 37031 1727204381.23129: stdout chunk (state=3): >>>I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "59", "second": "41", "epoch": "1727204381", "epoch_int": "1727204381", "date": "2024-09-24", "time": "14:59:41", "iso8601_micro": "2024-09-24T18:59:41.165279Z", "iso8601": "2024-09-24T18:59:41Z", "iso8601_basic": "20240924T145941165279", "iso8601_basic_short": "20240924T145941", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.76, "5m": 0.6, "15m": 0.31}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmen<<< 37031 1727204381.23713: stdout chunk (state=3): >>>tation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off 
[fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off 
[fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 37031 1727204381.25481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204381.25577: stderr chunk (state=3): >>><<< 37031 1727204381.25581: stdout chunk (state=3): >>><<< 37031 1727204381.25675: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": 
"64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "e28ab0e542474a869c23f7ace4640799", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2757, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 775, "free": 2757}, "nocache": {"free": 3233, "used": 299}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_uuid": "ec243623-fa66-7445-44ba-1070930583a9", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 
GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 743, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264270917632, "block_size": 4096, "block_total": 65519355, "block_available": 64519267, "block_used": 1000088, "inode_total": 131071472, "inode_available": 130998222, "inode_used": 73250, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPleAC0mV69PNpLSbmzZvoLD9LsCBzX6IHRLXV1uktk0r66T6Y57EoVgflJTdo6yU0zTaJjonNzFmvC69tiRsCyywGjnvnBOvIH2vrgNGCUdVYPZbbtmQlJvol7NFFfyXQR4RSPqBKT67rYbCzbETM4j+bdDgTeDk6l7wXwz9RVvAAAAFQCuAyyjbOBDKyIW26LGcI9/nmWpHwAAAIEApIE1W6KQ7qs5kJXBdSaPoWaZUxuQhXkPWORFe7/MBn5SojDfxvJjFPo6t4QsovaCnm532Zghh1ZdB0pNm0vYcRbz3wMdfMucw/KHWt6ZEtI+sLwuMyhAVEXzmE34iXkyePtELiYzY6NyxuJ04IujI9UwD7ZnqFBHVFz529oXikIAAACBAPdUu+4Qo82CMcmrGD9vNUgtsts6GCjqBDuov8GJEALZ9ZNLlyVoNtBHLMQH9e0czLygyNGw/IDosRQkKdX4Vh4A7KXujTIOyytaN4JVJCuOBY/PeX4lreAO/UTTUJ27yT/J0Oy2Hbt+d8fZnTkZReRNPFCzvdb1nuPMG5nAyQtL", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCzkKXWiNuOrU77QQcZuT2T9XVh655Sh8Sv9vLWLa1uj7ceaNsB0TBiqvDFvYPENhdKceYaGAFU7sjqbmp5dlivYwPBiBWvcOgqnpBqrMG5SvP1RMiORpW6GupBLnUaMVjopPLIi0/CDlSl2eODcEnQI6BpxCCSedEKU9UrRrCFJy+6KPQXepPwKwPTd1TMzO8wpo57B5MYrjnquTNxMfgBkYsHB/V77d0tKq8qGBTkAPD8wEWLIcZOI+SyYEfCraQ95dOGAPRTFijnd7S15CugSlJ/vvcHSFXOlbgFzeNnU2jZneagkBfaOJch72opD3ebISSHCx1/kJvHN7MbksI+ljJa3Nw5LwP1XjUpT7dQMOZJDdVStXKp86K4XpWud+wMbQVVyU5QoFsCl7YTWWmSDRiPJOQI2myfizCT8i42rJ0WXm5OnqpHn1Jw4nGlcVnfgPQA/zxMldzReXdHnvriqKC9+97XgY6pj42YYP78PhOu1D2xH1AXmloNM+63VvU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPT1h7wNcUomxtav688iXvnCnFqrHnEKf4gRaBY3w4BwbWOGxE8hq5snF9Tp+0agFeN/u980/y8BJWdWIO9Lz8I=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPe8liWy3mh5GzCz9W616J2ArXnLOjLOZSwfmBX3Q1SI", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 48676 10.31.13.78 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 48676 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "59", "second": "41", "epoch": "1727204381", "epoch_int": "1727204381", "date": "2024-09-24", "time": "14:59:41", "iso8601_micro": "2024-09-24T18:59:41.165279Z", "iso8601": "2024-09-24T18:59:41Z", "iso8601_basic": "20240924T145941165279", "iso8601_basic_short": "20240924T145941", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.76, "5m": 0.6, "15m": 0.31}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", 
"macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ffff:fef5:f2b9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.13.78", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:ff:f5:f2:b9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.13.78"], "ansible_all_ipv6_addresses": ["fe80::8ff:ffff:fef5:f2b9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.13.78", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ffff:fef5:f2b9"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204381.26041: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204381.26075: _low_level_execute_command(): starting 37031 1727204381.26094: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204380.3807957-37297-275643783748506/ > /dev/null 2>&1 && sleep 0' 37031 1727204381.26831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204381.26849: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.26878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.26897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.26941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204381.26957: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204381.26983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.27002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204381.27015: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.13.78 is address <<< 37031 1727204381.27026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204381.27038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.27052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.27077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.27099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204381.27112: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204381.27126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.27215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204381.27239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204381.27260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.27350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.29851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.29988: stderr chunk (state=3): >>><<< 37031 1727204381.30000: stdout chunk (state=3): >>><<< 37031 1727204381.30177: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204381.30181: handler run complete 37031 1727204381.30183: variable 'ansible_facts' from source: unknown 37031 1727204381.30314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.30758: variable 'ansible_facts' from source: unknown 37031 1727204381.30882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.31037: attempt loop complete, returning result 37031 1727204381.31056: _execute() done 37031 1727204381.31068: dumping result to json 37031 1727204381.31106: done dumping result, returning 37031 1727204381.31119: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [0affcd87-79f5-b754-dfb8-000000000115] 37031 1727204381.31134: sending task result for task 0affcd87-79f5-b754-dfb8-000000000115 ok: [managed-node2] 37031 1727204381.31832: no more pending results, returning what we have 37031 1727204381.31835: results queue empty 37031 1727204381.31836: checking for any_errors_fatal 37031 1727204381.31837: done checking for any_errors_fatal 37031 1727204381.31838: checking for max_fail_percentage 37031 1727204381.31839: done checking for max_fail_percentage 37031 1727204381.31840: checking to see if all hosts have failed and the running result is not ok 37031 1727204381.31841: done 
checking to see if all hosts have failed 37031 1727204381.31842: getting the remaining hosts for this loop 37031 1727204381.31844: done getting the remaining hosts for this loop 37031 1727204381.31848: getting the next task for host managed-node2 37031 1727204381.31857: done getting next task for host managed-node2 37031 1727204381.31859: ^ task is: TASK: meta (flush_handlers) 37031 1727204381.31861: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204381.31890: getting variables 37031 1727204381.31892: in VariableManager get_vars() 37031 1727204381.31928: Calling all_inventory to load vars for managed-node2 37031 1727204381.31931: Calling groups_inventory to load vars for managed-node2 37031 1727204381.31933: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.31944: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.31947: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.31950: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.32157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.32547: done with get_vars() 37031 1727204381.32560: done getting variables 37031 1727204381.32752: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000115 37031 1727204381.32758: WORKER PROCESS EXITING 37031 1727204381.32816: in VariableManager get_vars() 37031 1727204381.32832: Calling all_inventory to load vars for managed-node2 37031 1727204381.32835: Calling groups_inventory to load vars for managed-node2 37031 1727204381.32837: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.32842: Calling 
all_plugins_play to load vars for managed-node2 37031 1727204381.32844: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.32871: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.33088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.33315: done with get_vars() 37031 1727204381.33328: done queuing things up, now waiting for results queue to drain 37031 1727204381.33330: results queue empty 37031 1727204381.33331: checking for any_errors_fatal 37031 1727204381.33334: done checking for any_errors_fatal 37031 1727204381.33335: checking for max_fail_percentage 37031 1727204381.33336: done checking for max_fail_percentage 37031 1727204381.33337: checking to see if all hosts have failed and the running result is not ok 37031 1727204381.33338: done checking to see if all hosts have failed 37031 1727204381.33339: getting the remaining hosts for this loop 37031 1727204381.33340: done getting the remaining hosts for this loop 37031 1727204381.33342: getting the next task for host managed-node2 37031 1727204381.33346: done getting next task for host managed-node2 37031 1727204381.33349: ^ task is: TASK: Include the task 'show_interfaces.yml' 37031 1727204381.33350: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204381.33355: getting variables 37031 1727204381.33356: in VariableManager get_vars() 37031 1727204381.33371: Calling all_inventory to load vars for managed-node2 37031 1727204381.33373: Calling groups_inventory to load vars for managed-node2 37031 1727204381.33375: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.33379: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.33382: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.33384: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.33542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.33763: done with get_vars() 37031 1727204381.33784: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Tuesday 24 September 2024 14:59:41 -0400 (0:00:01.014) 0:00:03.883 ***** 37031 1727204381.33881: entering _queue_task() for managed-node2/include_tasks 37031 1727204381.34230: worker is 1 (out of 1 available) 37031 1727204381.34241: exiting _queue_task() for managed-node2/include_tasks 37031 1727204381.34256: done queuing things up, now waiting for results queue to drain 37031 1727204381.34257: waiting for pending results... 
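The banner above ("TASK [Include the task 'show_interfaces.yml']" at tests_ipv6.yml:9) is produced by an `include_tasks` task, and the surrounding trace shows the conditional `(ansible_distribution_major_version != '6')` being evaluated before it runs. A hypothetical sketch of what such a task looks like (the actual playbook contents are not reproduced in this log; the `when:` placement is an assumption):

```
# Hypothetical sketch -- the real tests_ipv6.yml is not shown in this log
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
  when: ansible_distribution_major_version != '6'
```

Tasks pulled in this way appear later in the trace as "processing included file" / "extending task lists for all hosts with included blocks" events.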
37031 1727204381.34637: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 37031 1727204381.34797: in run() - task 0affcd87-79f5-b754-dfb8-00000000000b 37031 1727204381.34816: variable 'ansible_search_path' from source: unknown 37031 1727204381.34880: calling self._execute() 37031 1727204381.34973: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.34984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.34996: variable 'omit' from source: magic vars 37031 1727204381.35476: variable 'ansible_distribution_major_version' from source: facts 37031 1727204381.35506: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204381.35516: _execute() done 37031 1727204381.35524: dumping result to json 37031 1727204381.35532: done dumping result, returning 37031 1727204381.35540: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-b754-dfb8-00000000000b] 37031 1727204381.35548: sending task result for task 0affcd87-79f5-b754-dfb8-00000000000b 37031 1727204381.35696: no more pending results, returning what we have 37031 1727204381.35702: in VariableManager get_vars() 37031 1727204381.35750: Calling all_inventory to load vars for managed-node2 37031 1727204381.35755: Calling groups_inventory to load vars for managed-node2 37031 1727204381.35758: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.35786: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.35789: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.35792: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.36039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.36246: done with get_vars() 37031 1727204381.36255: variable 'ansible_search_path' from 
source: unknown 37031 1727204381.36272: we have included files to process 37031 1727204381.36273: generating all_blocks data 37031 1727204381.36275: done generating all_blocks data 37031 1727204381.36276: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204381.36277: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204381.36279: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204381.36711: in VariableManager get_vars() 37031 1727204381.36731: done with get_vars() 37031 1727204381.36781: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000000b 37031 1727204381.36785: WORKER PROCESS EXITING 37031 1727204381.36876: done processing included file 37031 1727204381.36877: iterating over new_blocks loaded from include file 37031 1727204381.36879: in VariableManager get_vars() 37031 1727204381.36890: done with get_vars() 37031 1727204381.36891: filtering new block on tags 37031 1727204381.36902: done filtering new block on tags 37031 1727204381.36903: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 37031 1727204381.36906: extending task lists for all hosts with included blocks 37031 1727204381.36945: done extending task lists 37031 1727204381.36946: done processing included files 37031 1727204381.36946: results queue empty 37031 1727204381.36946: checking for any_errors_fatal 37031 1727204381.36947: done checking for any_errors_fatal 37031 1727204381.36948: checking for max_fail_percentage 37031 1727204381.36949: done checking for max_fail_percentage 37031 1727204381.36949: checking to see if all hosts have failed 
and the running result is not ok 37031 1727204381.36950: done checking to see if all hosts have failed 37031 1727204381.36950: getting the remaining hosts for this loop 37031 1727204381.36951: done getting the remaining hosts for this loop 37031 1727204381.36953: getting the next task for host managed-node2 37031 1727204381.36958: done getting next task for host managed-node2 37031 1727204381.36966: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 37031 1727204381.36970: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204381.36973: getting variables 37031 1727204381.36973: in VariableManager get_vars() 37031 1727204381.36982: Calling all_inventory to load vars for managed-node2 37031 1727204381.36984: Calling groups_inventory to load vars for managed-node2 37031 1727204381.36985: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.36988: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.36989: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.36991: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.37093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.37202: done with get_vars() 37031 1727204381.37208: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:59:41 -0400 (0:00:00.033) 0:00:03.917 ***** 37031 1727204381.37257: entering _queue_task() for managed-node2/include_tasks 37031 1727204381.37440: worker is 1 (out of 1 available) 37031 1727204381.37452: exiting _queue_task() for managed-node2/include_tasks 37031 1727204381.37467: done queuing things up, now waiting for results queue to drain 37031 1727204381.37469: waiting for pending results... 
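The repeated `debug1: auto-mux: Trying existing master` and `debug2: mux_client_request_session: master session id: 4` lines throughout this trace show OpenSSH connection multiplexing: every task reuses one persistent master connection instead of opening a fresh SSH session. A sketch of the kind of client configuration that enables this (illustrative values only; in this run the equivalent options arrive via Ansible's SSH connection plugin defaults and `ansible_ssh_extra_args`, not necessarily a config file):

```
# Illustrative ssh_config fragment, not taken from this run's actual config
Host *
    ControlMaster auto
    ControlPath ~/.ssh/ansible-%r@%h:%p
    ControlPersist 60s
```

With a live master, each subsequent command costs only the `mux_client_hello_exchange` round-trip visible in the log, which is why the per-task SSH overhead here stays in the tens of milliseconds.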
37031 1727204381.37620: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 37031 1727204381.37685: in run() - task 0affcd87-79f5-b754-dfb8-00000000012b 37031 1727204381.37695: variable 'ansible_search_path' from source: unknown 37031 1727204381.37699: variable 'ansible_search_path' from source: unknown 37031 1727204381.37727: calling self._execute() 37031 1727204381.37794: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.37799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.37806: variable 'omit' from source: magic vars 37031 1727204381.38071: variable 'ansible_distribution_major_version' from source: facts 37031 1727204381.38085: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204381.38090: _execute() done 37031 1727204381.38093: dumping result to json 37031 1727204381.38096: done dumping result, returning 37031 1727204381.38102: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-b754-dfb8-00000000012b] 37031 1727204381.38107: sending task result for task 0affcd87-79f5-b754-dfb8-00000000012b 37031 1727204381.38187: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000012b 37031 1727204381.38189: WORKER PROCESS EXITING 37031 1727204381.38213: no more pending results, returning what we have 37031 1727204381.38217: in VariableManager get_vars() 37031 1727204381.38259: Calling all_inventory to load vars for managed-node2 37031 1727204381.38261: Calling groups_inventory to load vars for managed-node2 37031 1727204381.38266: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.38276: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.38279: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.38282: Calling groups_plugins_play to load vars for managed-node2 37031 
1727204381.38402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.38566: done with get_vars() 37031 1727204381.38573: variable 'ansible_search_path' from source: unknown 37031 1727204381.38574: variable 'ansible_search_path' from source: unknown 37031 1727204381.38628: we have included files to process 37031 1727204381.38629: generating all_blocks data 37031 1727204381.38631: done generating all_blocks data 37031 1727204381.38637: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204381.38639: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204381.38641: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204381.39004: done processing included file 37031 1727204381.39006: iterating over new_blocks loaded from include file 37031 1727204381.39007: in VariableManager get_vars() 37031 1727204381.39024: done with get_vars() 37031 1727204381.39025: filtering new block on tags 37031 1727204381.39052: done filtering new block on tags 37031 1727204381.39057: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 37031 1727204381.39061: extending task lists for all hosts with included blocks 37031 1727204381.39173: done extending task lists 37031 1727204381.39175: done processing included files 37031 1727204381.39176: results queue empty 37031 1727204381.39176: checking for any_errors_fatal 37031 1727204381.39179: done checking for any_errors_fatal 37031 1727204381.39179: checking for max_fail_percentage 37031 1727204381.39180: done 
checking for max_fail_percentage 37031 1727204381.39181: checking to see if all hosts have failed and the running result is not ok 37031 1727204381.39182: done checking to see if all hosts have failed 37031 1727204381.39183: getting the remaining hosts for this loop 37031 1727204381.39184: done getting the remaining hosts for this loop 37031 1727204381.39186: getting the next task for host managed-node2 37031 1727204381.39190: done getting next task for host managed-node2 37031 1727204381.39192: ^ task is: TASK: Gather current interface info 37031 1727204381.39195: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204381.39197: getting variables 37031 1727204381.39198: in VariableManager get_vars() 37031 1727204381.39210: Calling all_inventory to load vars for managed-node2 37031 1727204381.39212: Calling groups_inventory to load vars for managed-node2 37031 1727204381.39214: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.39218: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.39220: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.39223: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.39507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.39695: done with get_vars() 37031 1727204381.39703: done getting variables 37031 1727204381.39735: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:59:41 -0400 (0:00:00.024) 0:00:03.942 ***** 37031 1727204381.39756: entering _queue_task() for managed-node2/command 37031 1727204381.39932: worker is 1 (out of 1 available) 37031 1727204381.39945: exiting _queue_task() for managed-node2/command 37031 1727204381.39957: done queuing things up, now waiting for results queue to drain 37031 1727204381.39958: waiting for pending results... 
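The `_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'` events that follow show how the connection plugin wraps every remote command in `/bin/sh -c` and collects rc/stdout/stderr. A minimal local analogue of that pattern (a sketch only, not Ansible's internal API; the function name is illustrative):

```python
import subprocess


def low_level_execute(cmd: str):
    # Wrap the command in /bin/sh -c, mirroring the log's
    # "executing: /bin/sh -c 'echo ~ && sleep 0'" lines, and
    # return (rc, stdout, stderr) like the "done: rc=0, stdout=..."
    # events in this trace. Illustrative sketch, not Ansible code.
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr


rc, out, err = low_level_execute("echo ~ && sleep 0")
# On success rc is 0 and out carries the shell's expansion of ~
```

In the real run this executes over the multiplexed SSH channel rather than locally, and the `echo ~` probe is how Ansible discovers the remote home directory before creating its `ansible-tmp-*` working directory.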
37031 1727204381.40112: running TaskExecutor() for managed-node2/TASK: Gather current interface info 37031 1727204381.40185: in run() - task 0affcd87-79f5-b754-dfb8-00000000013a 37031 1727204381.40194: variable 'ansible_search_path' from source: unknown 37031 1727204381.40198: variable 'ansible_search_path' from source: unknown 37031 1727204381.40224: calling self._execute() 37031 1727204381.40286: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.40290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.40297: variable 'omit' from source: magic vars 37031 1727204381.40551: variable 'ansible_distribution_major_version' from source: facts 37031 1727204381.40563: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204381.40569: variable 'omit' from source: magic vars 37031 1727204381.40605: variable 'omit' from source: magic vars 37031 1727204381.40629: variable 'omit' from source: magic vars 37031 1727204381.40661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204381.40690: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204381.40710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204381.40722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204381.40734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204381.40758: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204381.40761: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.40765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 
1727204381.40836: Set connection var ansible_connection to ssh 37031 1727204381.40840: Set connection var ansible_shell_type to sh 37031 1727204381.40846: Set connection var ansible_pipelining to False 37031 1727204381.40855: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204381.40858: Set connection var ansible_timeout to 10 37031 1727204381.40863: Set connection var ansible_shell_executable to /bin/sh 37031 1727204381.40884: variable 'ansible_shell_executable' from source: unknown 37031 1727204381.40887: variable 'ansible_connection' from source: unknown 37031 1727204381.40890: variable 'ansible_module_compression' from source: unknown 37031 1727204381.40894: variable 'ansible_shell_type' from source: unknown 37031 1727204381.40898: variable 'ansible_shell_executable' from source: unknown 37031 1727204381.40904: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.40910: variable 'ansible_pipelining' from source: unknown 37031 1727204381.40913: variable 'ansible_timeout' from source: unknown 37031 1727204381.40917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.41019: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204381.41033: variable 'omit' from source: magic vars 37031 1727204381.41037: starting attempt loop 37031 1727204381.41040: running the handler 37031 1727204381.41051: _low_level_execute_command(): starting 37031 1727204381.41060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204381.41770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.41790: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 37031 1727204381.41835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204381.41839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.41897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.44121: stdout chunk (state=3): >>>/root <<< 37031 1727204381.44299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.44341: stderr chunk (state=3): >>><<< 37031 1727204381.44345: stdout chunk (state=3): >>><<< 37031 1727204381.44368: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204381.44382: _low_level_execute_command(): starting 37031 1727204381.44392: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462 `" && echo ansible-tmp-1727204381.4437082-37337-160953434531462="` echo /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462 `" ) && sleep 0' 37031 1727204381.45130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.45279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.45296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.47861: stdout chunk (state=3): >>>ansible-tmp-1727204381.4437082-37337-160953434531462=/root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462 <<< 37031 1727204381.48056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.48071: stderr chunk (state=3): >>><<< 37031 1727204381.48075: stdout chunk (state=3): >>><<< 37031 1727204381.48091: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204381.4437082-37337-160953434531462=/root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204381.48116: variable 'ansible_module_compression' from source: unknown 37031 1727204381.48160: ANSIBALLZ: Using generic lock for ansible.legacy.command 37031 1727204381.48165: ANSIBALLZ: Acquiring lock 37031 1727204381.48168: ANSIBALLZ: Lock acquired: 140694173153808 37031 1727204381.48170: ANSIBALLZ: Creating module 37031 1727204381.56821: ANSIBALLZ: Writing module into payload 37031 1727204381.56899: ANSIBALLZ: Writing module 37031 1727204381.56919: ANSIBALLZ: Renaming module 37031 1727204381.56923: ANSIBALLZ: Done creating module 37031 1727204381.56938: variable 'ansible_facts' from source: unknown 37031 1727204381.56995: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/AnsiballZ_command.py 37031 1727204381.57099: Sending initial data 37031 1727204381.57103: Sent initial data (156 bytes) 37031 1727204381.57815: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.57819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.57856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.57860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.57862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.57908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204381.57920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.57985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.60437: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204381.60477: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204381.60520: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpdeu1rzgm /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/AnsiballZ_command.py <<< 37031 1727204381.60565: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204381.61393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.61507: stderr chunk (state=3): >>><<< 37031 1727204381.61510: stdout chunk (state=3): >>><<< 37031 1727204381.61526: done transferring module to remote 37031 1727204381.61536: _low_level_execute_command(): starting 37031 
1727204381.61542: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/ /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/AnsiballZ_command.py && sleep 0' 37031 1727204381.62025: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.62029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.62068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204381.62072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.62074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.62133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204381.62136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204381.62142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.62183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.64656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.64717: stderr chunk (state=3): >>><<< 
37031 1727204381.64720: stdout chunk (state=3): >>><<< 37031 1727204381.64734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204381.64742: _low_level_execute_command(): starting 37031 1727204381.64744: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/AnsiballZ_command.py && sleep 0' 37031 1727204381.65200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.65208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.65245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match not found <<< 37031 1727204381.65257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.65370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.65402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.86448: stdout chunk (state=3): >>> <<< 37031 1727204381.86457: stdout chunk (state=3): >>>{"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:59:41.858898", "end": "2024-09-24 14:59:41.863281", "delta": "0:00:00.004383", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204381.88277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204381.88281: stdout chunk (state=3): >>><<< 37031 1727204381.88287: stderr chunk (state=3): >>><<< 37031 1727204381.88316: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:59:41.858898", "end": "2024-09-24 14:59:41.863281", "delta": "0:00:00.004383", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204381.88353: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204381.88366: _low_level_execute_command(): starting 37031 1727204381.88371: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204381.4437082-37337-160953434531462/ > /dev/null 2>&1 && sleep 0' 37031 1727204381.89269: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204381.89272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.89274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.89276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.89278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204381.89280: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204381.89282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.89284: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204381.89286: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204381.89288: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204381.89289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204381.89291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204381.89293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204381.89295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204381.89297: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204381.89299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204381.89322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204381.89345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204381.89359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204381.89430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204381.92087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204381.92106: stderr chunk (state=3): >>><<< 37031 1727204381.92109: stdout chunk (state=3): >>><<< 37031 1727204381.92129: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204381.92137: handler run complete 37031 1727204381.92169: Evaluated conditional (False): False 37031 1727204381.92178: attempt loop complete, returning result 37031 1727204381.92181: _execute() done 37031 1727204381.92185: dumping result to json 37031 1727204381.92189: done dumping result, returning 37031 1727204381.92200: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-b754-dfb8-00000000013a] 37031 1727204381.92203: sending task result for task 0affcd87-79f5-b754-dfb8-00000000013a 37031 1727204381.92312: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000013a 37031 1727204381.92315: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.004383", "end": "2024-09-24 14:59:41.863281", "rc": 0, "start": "2024-09-24 14:59:41.858898" } STDOUT: bonding_masters eth0 lo 37031 1727204381.92382: no more pending results, returning what we have 37031 1727204381.92385: results queue empty 37031 1727204381.92386: checking for any_errors_fatal 37031 1727204381.92388: done checking for any_errors_fatal 37031 1727204381.92388: checking for max_fail_percentage 37031 1727204381.92390: done checking for max_fail_percentage 37031 1727204381.92391: checking to see if all hosts have failed 
and the running result is not ok 37031 1727204381.92392: done checking to see if all hosts have failed 37031 1727204381.92393: getting the remaining hosts for this loop 37031 1727204381.92394: done getting the remaining hosts for this loop 37031 1727204381.92398: getting the next task for host managed-node2 37031 1727204381.92404: done getting next task for host managed-node2 37031 1727204381.92407: ^ task is: TASK: Set current_interfaces 37031 1727204381.92410: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204381.92413: getting variables 37031 1727204381.92414: in VariableManager get_vars() 37031 1727204381.92455: Calling all_inventory to load vars for managed-node2 37031 1727204381.92458: Calling groups_inventory to load vars for managed-node2 37031 1727204381.92460: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.92471: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.92473: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.92475: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.92627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.92821: done with get_vars() 37031 1727204381.92833: done getting variables 37031 1727204381.92901: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:59:41 -0400 (0:00:00.531) 0:00:04.474 ***** 37031 1727204381.92932: entering _queue_task() for managed-node2/set_fact 37031 1727204381.93529: worker is 1 (out of 1 available) 37031 1727204381.93541: exiting _queue_task() for managed-node2/set_fact 37031 1727204381.93556: done queuing things up, now waiting for results queue to drain 37031 1727204381.93557: waiting for pending results... 
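The module invocation traced above (`_execute_module` with `ansible.legacy.command`, `chdir=/sys/class/net`, `_raw_params='ls -1'`) corresponds to a command task along these lines. This is a hypothetical reconstruction from the arguments visible in the trace, not the actual tasks file; the `register` variable name is inferred from the later `'_current_interfaces' from source: set_fact` line:

```yaml
# Hypothetical sketch of the "Gather current interface info" task whose
# execution is traced above; module args taken from the _execute_module call.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces
```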
37031 1727204381.94271: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 37031 1727204381.94568: in run() - task 0affcd87-79f5-b754-dfb8-00000000013b 37031 1727204381.94588: variable 'ansible_search_path' from source: unknown 37031 1727204381.94596: variable 'ansible_search_path' from source: unknown 37031 1727204381.94636: calling self._execute() 37031 1727204381.94745: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.94899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.94913: variable 'omit' from source: magic vars 37031 1727204381.95799: variable 'ansible_distribution_major_version' from source: facts 37031 1727204381.95815: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204381.95824: variable 'omit' from source: magic vars 37031 1727204381.95989: variable 'omit' from source: magic vars 37031 1727204381.96184: variable '_current_interfaces' from source: set_fact 37031 1727204381.96251: variable 'omit' from source: magic vars 37031 1727204381.96347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204381.96405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204381.96545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204381.96574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204381.96591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204381.96622: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204381.96636: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.96675: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.96895: Set connection var ansible_connection to ssh 37031 1727204381.96904: Set connection var ansible_shell_type to sh 37031 1727204381.96915: Set connection var ansible_pipelining to False 37031 1727204381.96927: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204381.96936: Set connection var ansible_timeout to 10 37031 1727204381.96971: Set connection var ansible_shell_executable to /bin/sh 37031 1727204381.97002: variable 'ansible_shell_executable' from source: unknown 37031 1727204381.97077: variable 'ansible_connection' from source: unknown 37031 1727204381.97084: variable 'ansible_module_compression' from source: unknown 37031 1727204381.97091: variable 'ansible_shell_type' from source: unknown 37031 1727204381.97096: variable 'ansible_shell_executable' from source: unknown 37031 1727204381.97102: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204381.97109: variable 'ansible_pipelining' from source: unknown 37031 1727204381.97114: variable 'ansible_timeout' from source: unknown 37031 1727204381.97123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204381.97362: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204381.97481: variable 'omit' from source: magic vars 37031 1727204381.97492: starting attempt loop 37031 1727204381.97497: running the handler 37031 1727204381.97516: handler run complete 37031 1727204381.97529: attempt loop complete, returning result 37031 1727204381.97535: _execute() done 37031 1727204381.97541: dumping result to json 37031 1727204381.97547: done dumping result, returning 37031 
1727204381.97623: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-b754-dfb8-00000000013b] 37031 1727204381.97633: sending task result for task 0affcd87-79f5-b754-dfb8-00000000013b 37031 1727204381.97739: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000013b 37031 1727204381.97746: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 37031 1727204381.97817: no more pending results, returning what we have 37031 1727204381.97820: results queue empty 37031 1727204381.97821: checking for any_errors_fatal 37031 1727204381.97827: done checking for any_errors_fatal 37031 1727204381.97827: checking for max_fail_percentage 37031 1727204381.97829: done checking for max_fail_percentage 37031 1727204381.97830: checking to see if all hosts have failed and the running result is not ok 37031 1727204381.97831: done checking to see if all hosts have failed 37031 1727204381.97831: getting the remaining hosts for this loop 37031 1727204381.97833: done getting the remaining hosts for this loop 37031 1727204381.97837: getting the next task for host managed-node2 37031 1727204381.97846: done getting next task for host managed-node2 37031 1727204381.97849: ^ task is: TASK: Show current_interfaces 37031 1727204381.97852: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204381.97856: getting variables 37031 1727204381.97858: in VariableManager get_vars() 37031 1727204381.97905: Calling all_inventory to load vars for managed-node2 37031 1727204381.97908: Calling groups_inventory to load vars for managed-node2 37031 1727204381.97910: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204381.97919: Calling all_plugins_play to load vars for managed-node2 37031 1727204381.97921: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204381.97924: Calling groups_plugins_play to load vars for managed-node2 37031 1727204381.98124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204381.98316: done with get_vars() 37031 1727204381.98327: done getting variables 37031 1727204381.98449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:59:41 -0400 (0:00:00.055) 0:00:04.529 ***** 37031 1727204381.98481: entering _queue_task() for managed-node2/debug 37031 1727204381.98483: Creating lock for debug 37031 1727204381.98938: worker is 1 (out of 1 available) 37031 1727204381.98949: exiting _queue_task() for managed-node2/debug 37031 1727204381.98961: done queuing things up, now waiting for results queue to drain 37031 1727204381.98962: waiting for pending results... 
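The `Set current_interfaces` task at `get_current_interfaces.yml:9` produces the `ansible_facts` shown in the `ok:` result above. A plausible sketch follows; the `stdout_lines` expression is an assumption, since the trace only shows the resulting list, not the template:

```yaml
# Hypothetical sketch of the "Set current_interfaces" task; the resulting
# fact matches the ok: result above, but the exact expression is assumed.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```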
37031 1727204381.99846: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 37031 1727204382.00074: in run() - task 0affcd87-79f5-b754-dfb8-00000000012c 37031 1727204382.00095: variable 'ansible_search_path' from source: unknown 37031 1727204382.00103: variable 'ansible_search_path' from source: unknown 37031 1727204382.00262: calling self._execute() 37031 1727204382.00339: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.00369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.00383: variable 'omit' from source: magic vars 37031 1727204382.01003: variable 'ansible_distribution_major_version' from source: facts 37031 1727204382.01029: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204382.01041: variable 'omit' from source: magic vars 37031 1727204382.01093: variable 'omit' from source: magic vars 37031 1727204382.01207: variable 'current_interfaces' from source: set_fact 37031 1727204382.01250: variable 'omit' from source: magic vars 37031 1727204382.01298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204382.01392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204382.01422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204382.01450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.01472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.01515: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204382.01524: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.01533: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.01652: Set connection var ansible_connection to ssh 37031 1727204382.01670: Set connection var ansible_shell_type to sh 37031 1727204382.01684: Set connection var ansible_pipelining to False 37031 1727204382.01697: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204382.01712: Set connection var ansible_timeout to 10 37031 1727204382.01727: Set connection var ansible_shell_executable to /bin/sh 37031 1727204382.01766: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.01778: variable 'ansible_connection' from source: unknown 37031 1727204382.01784: variable 'ansible_module_compression' from source: unknown 37031 1727204382.01790: variable 'ansible_shell_type' from source: unknown 37031 1727204382.01796: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.01802: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.01808: variable 'ansible_pipelining' from source: unknown 37031 1727204382.01817: variable 'ansible_timeout' from source: unknown 37031 1727204382.01825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.01988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204382.02006: variable 'omit' from source: magic vars 37031 1727204382.02016: starting attempt loop 37031 1727204382.02023: running the handler 37031 1727204382.02084: handler run complete 37031 1727204382.02106: attempt loop complete, returning result 37031 1727204382.02114: _execute() done 37031 1727204382.02121: dumping result to json 37031 1727204382.02127: done dumping result, returning 37031 1727204382.02137: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-b754-dfb8-00000000012c] 37031 1727204382.02151: sending task result for task 0affcd87-79f5-b754-dfb8-00000000012c ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 37031 1727204382.02309: no more pending results, returning what we have 37031 1727204382.02312: results queue empty 37031 1727204382.02313: checking for any_errors_fatal 37031 1727204382.02318: done checking for any_errors_fatal 37031 1727204382.02318: checking for max_fail_percentage 37031 1727204382.02320: done checking for max_fail_percentage 37031 1727204382.02321: checking to see if all hosts have failed and the running result is not ok 37031 1727204382.02322: done checking to see if all hosts have failed 37031 1727204382.02323: getting the remaining hosts for this loop 37031 1727204382.02324: done getting the remaining hosts for this loop 37031 1727204382.02328: getting the next task for host managed-node2 37031 1727204382.02336: done getting next task for host managed-node2 37031 1727204382.02340: ^ task is: TASK: Include the task 'manage_test_interface.yml' 37031 1727204382.02342: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204382.02346: getting variables 37031 1727204382.02348: in VariableManager get_vars() 37031 1727204382.02398: Calling all_inventory to load vars for managed-node2 37031 1727204382.02401: Calling groups_inventory to load vars for managed-node2 37031 1727204382.02404: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204382.02414: Calling all_plugins_play to load vars for managed-node2 37031 1727204382.02417: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204382.02420: Calling groups_plugins_play to load vars for managed-node2 37031 1727204382.02697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204382.02946: done with get_vars() 37031 1727204382.02966: done getting variables 37031 1727204382.03131: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000012c 37031 1727204382.03134: WORKER PROCESS EXITING TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.047) 0:00:04.576 ***** 37031 1727204382.03207: entering _queue_task() for managed-node2/include_tasks 37031 1727204382.03685: worker is 1 (out of 1 available) 37031 1727204382.03699: exiting _queue_task() for managed-node2/include_tasks 37031 1727204382.03711: done queuing things up, now waiting for results queue to drain 37031 1727204382.03713: waiting for pending results... 
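The include recorded at `tests_ipv6.yml:11` amounts to a single `include_tasks` entry. A hedged sketch; the relative path is an assumption based on the absolute file paths that appear in the trace:

```yaml
# Hypothetical sketch of the include at tests_ipv6.yml:11; the task name
# is confirmed by the TASK banner above, the path form is assumed.
- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml
```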
37031 1727204382.03984: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml'
37031 1727204382.04078: in run() - task 0affcd87-79f5-b754-dfb8-00000000000c
37031 1727204382.04096: variable 'ansible_search_path' from source: unknown
37031 1727204382.04138: calling self._execute()
37031 1727204382.04223: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.04233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.04244: variable 'omit' from source: magic vars
37031 1727204382.04908: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.04925: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.04935: _execute() done
37031 1727204382.04944: dumping result to json
37031 1727204382.04951: done dumping result, returning
37031 1727204382.04963: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-b754-dfb8-00000000000c]
37031 1727204382.04974: sending task result for task 0affcd87-79f5-b754-dfb8-00000000000c
37031 1727204382.05085: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000000c
37031 1727204382.05093: WORKER PROCESS EXITING
37031 1727204382.05131: no more pending results, returning what we have
37031 1727204382.05137: in VariableManager get_vars()
37031 1727204382.05190: Calling all_inventory to load vars for managed-node2
37031 1727204382.05193: Calling groups_inventory to load vars for managed-node2
37031 1727204382.05195: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.05208: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.05211: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.05214: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.05474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.05684: done with get_vars()
37031 1727204382.05698: variable 'ansible_search_path' from source: unknown
37031 1727204382.05713: we have included files to process
37031 1727204382.05714: generating all_blocks data
37031 1727204382.05716: done generating all_blocks data
37031 1727204382.05720: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
37031 1727204382.05722: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
37031 1727204382.05724: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml
37031 1727204382.06844: in VariableManager get_vars()
37031 1727204382.06871: done with get_vars()
37031 1727204382.07093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
37031 1727204382.07703: done processing included file
37031 1727204382.07705: iterating over new_blocks loaded from include file
37031 1727204382.07706: in VariableManager get_vars()
37031 1727204382.07724: done with get_vars()
37031 1727204382.07726: filtering new block on tags
37031 1727204382.07760: done filtering new block on tags
37031 1727204382.07762: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2
37031 1727204382.07770: extending task lists for all hosts with included blocks
37031 1727204382.07942: done extending task lists
37031 1727204382.07943: done processing included files
37031 1727204382.07944: results queue empty
37031 1727204382.07945: checking for any_errors_fatal
37031 1727204382.07948: done checking for any_errors_fatal
37031 1727204382.07948: checking for max_fail_percentage
37031 1727204382.07949: done checking for max_fail_percentage
37031 1727204382.07950: checking to see if all hosts have failed and the running result is not ok
37031 1727204382.07951: done checking to see if all hosts have failed
37031 1727204382.07951: getting the remaining hosts for this loop
37031 1727204382.07955: done getting the remaining hosts for this loop
37031 1727204382.07957: getting the next task for host managed-node2
37031 1727204382.07961: done getting next task for host managed-node2
37031 1727204382.07962: ^ task is: TASK: Ensure state in ["present", "absent"]
37031 1727204382.07966: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204382.07968: getting variables
37031 1727204382.07969: in VariableManager get_vars()
37031 1727204382.07980: Calling all_inventory to load vars for managed-node2
37031 1727204382.07982: Calling groups_inventory to load vars for managed-node2
37031 1727204382.08002: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.08008: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.08010: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.08012: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.08134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.08339: done with get_vars()
37031 1727204382.08349: done getting variables
37031 1727204382.08418: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Ensure state in ["present", "absent"]] ***********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3
Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.052) 0:00:04.629 *****
37031 1727204382.08456: entering _queue_task() for managed-node2/fail
37031 1727204382.08458: Creating lock for fail
37031 1727204382.09347: worker is 1 (out of 1 available)
37031 1727204382.09361: exiting _queue_task() for managed-node2/fail
37031 1727204382.09374: done queuing things up, now waiting for results queue to drain
37031 1727204382.09375: waiting for pending results...
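The two validation tasks that run next (manage_test_interface.yml:3 and :8) gate the include on its parameters. A minimal sketch of what they presumably look like, reconstructed only from the conditionals and the `fail` action plugin visible in this log; the `msg` text is illustrative, not the collection's actual wording:

```yaml
# Hedged sketch: `when` expressions are taken verbatim from the logged
# "Evaluated conditional" entries; everything else is an assumption.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Invalid state: {{ state }}"   # hypothetical message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Invalid type: {{ type }}"     # hypothetical message
  when: type not in ["dummy", "tap", "veth"]
```

Because both `when` expressions evaluate to False in this run, each task is reported as `skipping:` with `"skip_reason": "Conditional result was False"` rather than failing the play.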
37031 1727204382.09656: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"]
37031 1727204382.09805: in run() - task 0affcd87-79f5-b754-dfb8-000000000156
37031 1727204382.09829: variable 'ansible_search_path' from source: unknown
37031 1727204382.09838: variable 'ansible_search_path' from source: unknown
37031 1727204382.09888: calling self._execute()
37031 1727204382.09994: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.10007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.10019: variable 'omit' from source: magic vars
37031 1727204382.10427: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.10447: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.10615: variable 'state' from source: include params
37031 1727204382.10627: Evaluated conditional (state not in ["present", "absent"]): False
37031 1727204382.10634: when evaluation is False, skipping this task
37031 1727204382.10641: _execute() done
37031 1727204382.10647: dumping result to json
37031 1727204382.10658: done dumping result, returning
37031 1727204382.10670: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-b754-dfb8-000000000156]
37031 1727204382.10679: sending task result for task 0affcd87-79f5-b754-dfb8-000000000156
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "state not in [\"present\", \"absent\"]",
    "skip_reason": "Conditional result was False"
}
37031 1727204382.10855: no more pending results, returning what we have
37031 1727204382.10860: results queue empty
37031 1727204382.10861: checking for any_errors_fatal
37031 1727204382.10862: done checking for any_errors_fatal
37031 1727204382.10863: checking for max_fail_percentage
37031 1727204382.10866: done checking for max_fail_percentage
37031 1727204382.10867: checking to see if all hosts have failed and the running result is not ok
37031 1727204382.10868: done checking to see if all hosts have failed
37031 1727204382.10869: getting the remaining hosts for this loop
37031 1727204382.10870: done getting the remaining hosts for this loop
37031 1727204382.10874: getting the next task for host managed-node2
37031 1727204382.10881: done getting next task for host managed-node2
37031 1727204382.10884: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"]
37031 1727204382.10886: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204382.10890: getting variables
37031 1727204382.10893: in VariableManager get_vars()
37031 1727204382.10937: Calling all_inventory to load vars for managed-node2
37031 1727204382.10940: Calling groups_inventory to load vars for managed-node2
37031 1727204382.10943: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.10958: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.10961: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.10967: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.11168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.11420: done with get_vars()
37031 1727204382.11431: done getting variables
37031 1727204382.11624: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000156
37031 1727204382.11627: WORKER PROCESS EXITING
37031 1727204382.11646: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Ensure type in ["dummy", "tap", "veth"]] *********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8
Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.032) 0:00:04.661 *****
37031 1727204382.11686: entering _queue_task() for managed-node2/fail
37031 1727204382.12097: worker is 1 (out of 1 available)
37031 1727204382.12109: exiting _queue_task() for managed-node2/fail
37031 1727204382.12120: done queuing things up, now waiting for results queue to drain
37031 1727204382.12121: waiting for pending results...
37031 1727204382.13028: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"]
37031 1727204382.13238: in run() - task 0affcd87-79f5-b754-dfb8-000000000157
37031 1727204382.13285: variable 'ansible_search_path' from source: unknown
37031 1727204382.13293: variable 'ansible_search_path' from source: unknown
37031 1727204382.13334: calling self._execute()
37031 1727204382.13440: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.13451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.13471: variable 'omit' from source: magic vars
37031 1727204382.13916: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.13944: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.14104: variable 'type' from source: play vars
37031 1727204382.14116: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False
37031 1727204382.14123: when evaluation is False, skipping this task
37031 1727204382.14131: _execute() done
37031 1727204382.14144: dumping result to json
37031 1727204382.14163: done dumping result, returning
37031 1727204382.14177: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-b754-dfb8-000000000157]
37031 1727204382.14188: sending task result for task 0affcd87-79f5-b754-dfb8-000000000157
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]",
    "skip_reason": "Conditional result was False"
}
37031 1727204382.14351: no more pending results, returning what we have
37031 1727204382.14358: results queue empty
37031 1727204382.14360: checking for any_errors_fatal
37031 1727204382.14369: done checking for any_errors_fatal
37031 1727204382.14371: checking for max_fail_percentage
37031 1727204382.14373: done checking for max_fail_percentage
37031 1727204382.14374: checking to see if all hosts have failed and the running result is not ok
37031 1727204382.14375: done checking to see if all hosts have failed
37031 1727204382.14376: getting the remaining hosts for this loop
37031 1727204382.14377: done getting the remaining hosts for this loop
37031 1727204382.14382: getting the next task for host managed-node2
37031 1727204382.14389: done getting next task for host managed-node2
37031 1727204382.14392: ^ task is: TASK: Include the task 'show_interfaces.yml'
37031 1727204382.14394: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204382.14398: getting variables
37031 1727204382.14400: in VariableManager get_vars()
37031 1727204382.14449: Calling all_inventory to load vars for managed-node2
37031 1727204382.14452: Calling groups_inventory to load vars for managed-node2
37031 1727204382.14458: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.14478: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.14482: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.14486: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.14702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.14925: done with get_vars()
37031 1727204382.14942: done getting variables
37031 1727204382.15149: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000157
37031 1727204382.15152: WORKER PROCESS EXITING

TASK [Include the task 'show_interfaces.yml'] **********************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13
Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.035) 0:00:04.698 *****
37031 1727204382.15369: entering _queue_task() for managed-node2/include_tasks
37031 1727204382.15746: worker is 1 (out of 1 available)
37031 1727204382.15761: exiting _queue_task() for managed-node2/include_tasks
37031 1727204382.15774: done queuing things up, now waiting for results queue to drain
37031 1727204382.15775: waiting for pending results...
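The nested `include_tasks` chain that drives the rest of this log (manage_test_interface.yml → show_interfaces.yml → get_current_interfaces.yml, all under the same `tasks/` directory) would look roughly like this; only the file names and task names are taken from the log, the layout itself is an assumption:

```yaml
# In manage_test_interface.yml (sketch):
- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml

# In show_interfaces.yml (sketch):
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Each include is what produces the "we have included files to process / generating all_blocks data / extending task lists" sequences in the debug output, since `include_tasks` loads and splices new task blocks at run time.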
37031 1727204382.16059: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml'
37031 1727204382.16188: in run() - task 0affcd87-79f5-b754-dfb8-000000000158
37031 1727204382.16211: variable 'ansible_search_path' from source: unknown
37031 1727204382.16220: variable 'ansible_search_path' from source: unknown
37031 1727204382.16276: calling self._execute()
37031 1727204382.16379: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.16390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.16404: variable 'omit' from source: magic vars
37031 1727204382.16823: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.16844: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.16860: _execute() done
37031 1727204382.16876: dumping result to json
37031 1727204382.16889: done dumping result, returning
37031 1727204382.16899: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-b754-dfb8-000000000158]
37031 1727204382.16908: sending task result for task 0affcd87-79f5-b754-dfb8-000000000158
37031 1727204382.17039: no more pending results, returning what we have
37031 1727204382.17045: in VariableManager get_vars()
37031 1727204382.17102: Calling all_inventory to load vars for managed-node2
37031 1727204382.17105: Calling groups_inventory to load vars for managed-node2
37031 1727204382.17108: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.17124: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.17127: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.17130: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.17419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.17638: done with get_vars()
37031 1727204382.17646: variable 'ansible_search_path' from source: unknown
37031 1727204382.17648: variable 'ansible_search_path' from source: unknown
37031 1727204382.17715: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000158
37031 1727204382.17719: WORKER PROCESS EXITING
37031 1727204382.17744: we have included files to process
37031 1727204382.17746: generating all_blocks data
37031 1727204382.17748: done generating all_blocks data
37031 1727204382.17755: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
37031 1727204382.17756: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
37031 1727204382.17759: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml
37031 1727204382.18184: in VariableManager get_vars()
37031 1727204382.18500: done with get_vars()
37031 1727204382.18627: done processing included file
37031 1727204382.18629: iterating over new_blocks loaded from include file
37031 1727204382.18631: in VariableManager get_vars()
37031 1727204382.18650: done with get_vars()
37031 1727204382.18651: filtering new block on tags
37031 1727204382.18674: done filtering new block on tags
37031 1727204382.18677: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2
37031 1727204382.18682: extending task lists for all hosts with included blocks
37031 1727204382.19655: done extending task lists
37031 1727204382.19657: done processing included files
37031 1727204382.19658: results queue empty
37031 1727204382.19659: checking for any_errors_fatal
37031 1727204382.19662: done checking for any_errors_fatal
37031 1727204382.19662: checking for max_fail_percentage
37031 1727204382.19665: done checking for max_fail_percentage
37031 1727204382.19666: checking to see if all hosts have failed and the running result is not ok
37031 1727204382.19667: done checking to see if all hosts have failed
37031 1727204382.19668: getting the remaining hosts for this loop
37031 1727204382.19669: done getting the remaining hosts for this loop
37031 1727204382.19671: getting the next task for host managed-node2
37031 1727204382.19791: done getting next task for host managed-node2
37031 1727204382.19798: ^ task is: TASK: Include the task 'get_current_interfaces.yml'
37031 1727204382.19801: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204382.19804: getting variables
37031 1727204382.19805: in VariableManager get_vars()
37031 1727204382.19819: Calling all_inventory to load vars for managed-node2
37031 1727204382.19821: Calling groups_inventory to load vars for managed-node2
37031 1727204382.19823: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.19829: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.19831: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.19834: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.20185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.21404: done with get_vars()
37031 1727204382.21417: done getting variables

TASK [Include the task 'get_current_interfaces.yml'] ***************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3
Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.061) 0:00:04.760 *****
37031 1727204382.21527: entering _queue_task() for managed-node2/include_tasks
37031 1727204382.21845: worker is 1 (out of 1 available)
37031 1727204382.21858: exiting _queue_task() for managed-node2/include_tasks
37031 1727204382.21873: done queuing things up, now waiting for results queue to drain
37031 1727204382.21874: waiting for pending results...
37031 1727204382.22158: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml'
37031 1727204382.22297: in run() - task 0affcd87-79f5-b754-dfb8-00000000017f
37031 1727204382.22323: variable 'ansible_search_path' from source: unknown
37031 1727204382.22341: variable 'ansible_search_path' from source: unknown
37031 1727204382.22390: calling self._execute()
37031 1727204382.22497: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.22508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.22521: variable 'omit' from source: magic vars
37031 1727204382.22943: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.22968: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.22993: _execute() done
37031 1727204382.23001: dumping result to json
37031 1727204382.23008: done dumping result, returning
37031 1727204382.23017: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-b754-dfb8-00000000017f]
37031 1727204382.23025: sending task result for task 0affcd87-79f5-b754-dfb8-00000000017f
37031 1727204382.23303: no more pending results, returning what we have
37031 1727204382.23309: in VariableManager get_vars()
37031 1727204382.23363: Calling all_inventory to load vars for managed-node2
37031 1727204382.23473: Calling groups_inventory to load vars for managed-node2
37031 1727204382.23476: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.23491: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.23494: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.23497: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.23700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.23930: done with get_vars()
37031 1727204382.23938: variable 'ansible_search_path' from source: unknown
37031 1727204382.23939: variable 'ansible_search_path' from source: unknown
37031 1727204382.24013: we have included files to process
37031 1727204382.24014: generating all_blocks data
37031 1727204382.24016: done generating all_blocks data
37031 1727204382.24017: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
37031 1727204382.24018: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
37031 1727204382.24020: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml
37031 1727204382.24329: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000017f
37031 1727204382.24333: WORKER PROCESS EXITING
37031 1727204382.24800: done processing included file
37031 1727204382.24802: iterating over new_blocks loaded from include file
37031 1727204382.24803: in VariableManager get_vars()
37031 1727204382.24824: done with get_vars()
37031 1727204382.24826: filtering new block on tags
37031 1727204382.24844: done filtering new block on tags
37031 1727204382.24846: done iterating over new_blocks loaded from include file
included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2
37031 1727204382.24851: extending task lists for all hosts with included blocks
37031 1727204382.25161: done extending task lists
37031 1727204382.25163: done processing included files
37031 1727204382.25308: results queue empty
37031 1727204382.25309: checking for any_errors_fatal
37031 1727204382.25316: done checking for any_errors_fatal
37031 1727204382.25317: checking for max_fail_percentage
37031 1727204382.25318: done checking for max_fail_percentage
37031 1727204382.25319: checking to see if all hosts have failed and the running result is not ok
37031 1727204382.25320: done checking to see if all hosts have failed
37031 1727204382.25321: getting the remaining hosts for this loop
37031 1727204382.25322: done getting the remaining hosts for this loop
37031 1727204382.25325: getting the next task for host managed-node2
37031 1727204382.25330: done getting next task for host managed-node2
37031 1727204382.25332: ^ task is: TASK: Gather current interface info
37031 1727204382.25336: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204382.25338: getting variables
37031 1727204382.25339: in VariableManager get_vars()
37031 1727204382.25352: Calling all_inventory to load vars for managed-node2
37031 1727204382.25357: Calling groups_inventory to load vars for managed-node2
37031 1727204382.25359: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204382.25366: Calling all_plugins_play to load vars for managed-node2
37031 1727204382.25368: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204382.25371: Calling groups_plugins_play to load vars for managed-node2
37031 1727204382.25795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204382.26322: done with get_vars()
37031 1727204382.26332: done getting variables
37031 1727204382.26379: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gather current interface info] *******************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.049) 0:00:04.809 *****
37031 1727204382.26473: entering _queue_task() for managed-node2/command
37031 1727204382.26810: worker is 1 (out of 1 available)
37031 1727204382.26823: exiting _queue_task() for managed-node2/command
37031 1727204382.26844: done queuing things up, now waiting for results queue to drain
37031 1727204382.26845: waiting for pending results...
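The "Gather current interface info" task that runs next uses the `command` action plugin (the log loads command.py for it). A hedged sketch of what such a task in get_current_interfaces.yml could look like; the exact command and the register name are hypothetical, not taken from this log:

```yaml
# Sketch only: the log confirms a `command`-plugin task named
# "Gather current interface info"; the command line and register
# variable below are illustrative assumptions.
- name: Gather current interface info
  command: ls /sys/class/net          # hypothetical command
  register: _current_interfaces       # hypothetical register name
```

Whatever the actual command is, it is what triggers the `_low_level_execute_command()` SSH activity that follows: first `echo ~` to resolve the remote home directory, then creation of a per-task temp directory under `~/.ansible/tmp` before the module payload is copied over.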
37031 1727204382.27116: running TaskExecutor() for managed-node2/TASK: Gather current interface info
37031 1727204382.27246: in run() - task 0affcd87-79f5-b754-dfb8-0000000001b6
37031 1727204382.27273: variable 'ansible_search_path' from source: unknown
37031 1727204382.27284: variable 'ansible_search_path' from source: unknown
37031 1727204382.27324: calling self._execute()
37031 1727204382.27422: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.27432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.27444: variable 'omit' from source: magic vars
37031 1727204382.27862: variable 'ansible_distribution_major_version' from source: facts
37031 1727204382.27883: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204382.27894: variable 'omit' from source: magic vars
37031 1727204382.27962: variable 'omit' from source: magic vars
37031 1727204382.28004: variable 'omit' from source: magic vars
37031 1727204382.28066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
37031 1727204382.28125: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
37031 1727204382.28170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
37031 1727204382.28324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204382.28341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204382.28389: variable 'inventory_hostname' from source: host vars for 'managed-node2'
37031 1727204382.28400: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.28408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.28524: Set connection var ansible_connection to ssh
37031 1727204382.28532: Set connection var ansible_shell_type to sh
37031 1727204382.28544: Set connection var ansible_pipelining to False
37031 1727204382.28560: Set connection var ansible_module_compression to ZIP_DEFLATED
37031 1727204382.28574: Set connection var ansible_timeout to 10
37031 1727204382.28584: Set connection var ansible_shell_executable to /bin/sh
37031 1727204382.28626: variable 'ansible_shell_executable' from source: unknown
37031 1727204382.28634: variable 'ansible_connection' from source: unknown
37031 1727204382.28642: variable 'ansible_module_compression' from source: unknown
37031 1727204382.28649: variable 'ansible_shell_type' from source: unknown
37031 1727204382.28660: variable 'ansible_shell_executable' from source: unknown
37031 1727204382.28671: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204382.28679: variable 'ansible_pipelining' from source: unknown
37031 1727204382.28686: variable 'ansible_timeout' from source: unknown
37031 1727204382.28694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204382.28870: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
37031 1727204382.28887: variable 'omit' from source: magic vars
37031 1727204382.28898: starting attempt loop
37031 1727204382.28905: running the handler
37031 1727204382.28936: _low_level_execute_command(): starting
37031 1727204382.28950: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
37031 1727204382.30387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
37031 1727204382.30391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204382.30424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match not found <<<
37031 1727204382.30428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.13.78 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204382.30430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204382.30485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204382.30488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204382.30492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204382.30552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
37031 1727204382.32788: stdout chunk (state=3): >>>/root <<<
37031 1727204382.32887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204382.33074: stderr chunk (state=3): >>><<<
37031 1727204382.33078: stdout chunk (state=3): >>><<<
37031 1727204382.33081: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.13.78 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 4
debug2: Received exit status from master 0
37031 1727204382.33084: _low_level_execute_command(): starting
37031 1727204382.33087: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234 `" && echo ansible-tmp-1727204382.3298213-37379-79051794922234="` echo /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234 `" ) && sleep 0'
37031 1727204382.33642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
37031 1727204382.33648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204382.33659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204382.33678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204382.33714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204382.33726: stderr chunk (state=3): >>>debug2: match not found <<<
37031 1727204382.33735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204382.33747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
37031 1727204382.33759: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<<
37031 1727204382.33762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
37031 1727204382.33772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204382.33782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204382.33793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204382.33800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204382.33807: stderr chunk (state=3): >>>debug2: match found <<<
37031 1727204382.33817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204382.33894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204382.33907: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204382.33918: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204382.33991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204382.36552: stdout chunk (state=3): >>>ansible-tmp-1727204382.3298213-37379-79051794922234=/root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234 <<<
37031 1727204382.36722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204382.36800: stderr chunk (state=3): >>><<<
37031 1727204382.36804: stdout chunk (state=3): >>><<<
37031 1727204382.36840: _low_level_execute_command() done: rc=0,
stdout=ansible-tmp-1727204382.3298213-37379-79051794922234=/root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204382.36872: variable 'ansible_module_compression' from source: unknown 37031 1727204382.37228: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204382.37232: variable 'ansible_facts' from source: unknown 37031 1727204382.37234: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/AnsiballZ_command.py 37031 1727204382.37237: Sending initial data 37031 1727204382.37239: Sent initial data (155 bytes) 37031 1727204382.38580: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204382.38586: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 37031 1727204382.38589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.38591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.38593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204382.38595: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204382.38596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.38598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204382.38600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204382.38602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204382.38604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204382.38605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.38607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.38609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204382.38611: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204382.38613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.38614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204382.38616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204382.38738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204382.38802: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 37031 1727204382.40874: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204382.40907: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204382.40947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpd3hciewb /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/AnsiballZ_command.py <<< 37031 1727204382.40990: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204382.41809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204382.41921: stderr chunk (state=3): >>><<< 37031 1727204382.41925: stdout chunk (state=3): >>><<< 37031 1727204382.41941: done transferring module to remote 37031 1727204382.41951: _low_level_execute_command(): starting 37031 1727204382.41958: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/ /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/AnsiballZ_command.py && sleep 0' 37031 1727204382.42689: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204382.42694: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.42749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.42755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204382.42772: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204382.42775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.42787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204382.42793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.42888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204382.42907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204382.43066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204382.45321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204382.45382: stderr chunk (state=3): >>><<< 37031 1727204382.45386: stdout chunk (state=3): >>><<< 37031 1727204382.45401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204382.45404: _low_level_execute_command(): starting 37031 1727204382.45409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/AnsiballZ_command.py && sleep 0' 37031 1727204382.45869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.45884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.45910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204382.45917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204382.45922: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 37031 1727204382.45930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.45939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.45945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.46013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204382.46019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204382.46075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204382.67653: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:59:42.671982", "end": "2024-09-24 14:59:42.675818", "delta": "0:00:00.003836", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204382.69000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204382.69004: stdout chunk (state=3): >>><<< 37031 1727204382.69007: stderr chunk (state=3): >>><<< 37031 1727204382.69071: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:59:42.671982", "end": "2024-09-24 14:59:42.675818", "delta": "0:00:00.003836", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204382.69076: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204382.69175: _low_level_execute_command(): starting 37031 1727204382.69178: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204382.3298213-37379-79051794922234/ > /dev/null 2>&1 && sleep 0' 37031 1727204382.70362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204382.70490: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204382.70507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204382.70570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.70573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204382.70584: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204382.70587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204382.70650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204382.70682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204382.70686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204382.70736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 37031 1727204382.73179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204382.73251: stderr chunk (state=3): >>><<< 37031 1727204382.73255: stdout chunk (state=3): >>><<< 37031 1727204382.73371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 37031 1727204382.73377: handler run complete 37031 1727204382.73379: Evaluated conditional (False): False 37031 1727204382.73381: attempt loop complete, returning result 37031 1727204382.73383: _execute() done 37031 1727204382.73384: dumping result to json 37031 1727204382.73386: done dumping result, returning 37031 1727204382.73388: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-b754-dfb8-0000000001b6] 37031 1727204382.73390: sending task result for task 0affcd87-79f5-b754-dfb8-0000000001b6 37031 1727204382.73588: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000001b6 37031 1727204382.73591: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003836", "end": "2024-09-24 14:59:42.675818", "rc": 0, "start": "2024-09-24 14:59:42.671982" } STDOUT: bonding_masters eth0 lo 37031 1727204382.73689: no more pending results, returning what we have 37031 1727204382.73693: results queue empty 37031 1727204382.73693: checking for any_errors_fatal 37031 1727204382.73695: done checking for any_errors_fatal 37031 1727204382.73695: checking for max_fail_percentage 37031 1727204382.73697: done checking for max_fail_percentage 37031 1727204382.73698: checking to see if all hosts have failed and the running result is not ok 37031 1727204382.73699: done checking to see if all hosts have failed 37031 1727204382.73700: getting the remaining hosts for this loop 37031 1727204382.73702: done getting the remaining hosts for this loop 37031 1727204382.73706: getting the next task for host managed-node2 37031 1727204382.73713: done getting next task for host managed-node2 37031 1727204382.73716: ^ task is: TASK: Set current_interfaces 37031 1727204382.73723: ^ state is: HOST STATE: block=2, 
task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204382.73727: getting variables 37031 1727204382.73729: in VariableManager get_vars() 37031 1727204382.73781: Calling all_inventory to load vars for managed-node2 37031 1727204382.73784: Calling groups_inventory to load vars for managed-node2 37031 1727204382.73786: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204382.73798: Calling all_plugins_play to load vars for managed-node2 37031 1727204382.73800: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204382.73804: Calling groups_plugins_play to load vars for managed-node2 37031 1727204382.74777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204382.75185: done with get_vars() 37031 1727204382.75197: done getting variables 37031 1727204382.75378: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.489) 0:00:05.298 ***** 37031 1727204382.75411: entering _queue_task() for managed-node2/set_fact 37031 1727204382.76044: worker is 1 (out of 1 available) 37031 1727204382.76055: exiting _queue_task() for managed-node2/set_fact 37031 1727204382.76069: done queuing things up, now waiting for results queue to drain 37031 1727204382.76070: waiting for pending results... 
37031 1727204382.77010: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 37031 1727204382.77137: in run() - task 0affcd87-79f5-b754-dfb8-0000000001b7 37031 1727204382.77314: variable 'ansible_search_path' from source: unknown 37031 1727204382.77323: variable 'ansible_search_path' from source: unknown 37031 1727204382.77361: calling self._execute() 37031 1727204382.77558: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.77573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.77588: variable 'omit' from source: magic vars 37031 1727204382.78523: variable 'ansible_distribution_major_version' from source: facts 37031 1727204382.79041: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204382.79319: variable 'omit' from source: magic vars 37031 1727204382.79386: variable 'omit' from source: magic vars 37031 1727204382.79949: variable '_current_interfaces' from source: set_fact 37031 1727204382.80024: variable 'omit' from source: magic vars 37031 1727204382.80123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204382.80331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204382.80424: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204382.80448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.80465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.80540: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204382.80622: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.80633: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.80851: Set connection var ansible_connection to ssh 37031 1727204382.80857: Set connection var ansible_shell_type to sh 37031 1727204382.80870: Set connection var ansible_pipelining to False 37031 1727204382.80880: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204382.80887: Set connection var ansible_timeout to 10 37031 1727204382.80893: Set connection var ansible_shell_executable to /bin/sh 37031 1727204382.80920: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.80926: variable 'ansible_connection' from source: unknown 37031 1727204382.80931: variable 'ansible_module_compression' from source: unknown 37031 1727204382.80943: variable 'ansible_shell_type' from source: unknown 37031 1727204382.80954: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.81057: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.81068: variable 'ansible_pipelining' from source: unknown 37031 1727204382.81074: variable 'ansible_timeout' from source: unknown 37031 1727204382.81080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.81233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204382.81389: variable 'omit' from source: magic vars 37031 1727204382.81418: starting attempt loop 37031 1727204382.81425: running the handler 37031 1727204382.81442: handler run complete 37031 1727204382.81458: attempt loop complete, returning result 37031 1727204382.81468: _execute() done 37031 1727204382.81477: dumping result to json 37031 1727204382.81490: done dumping result, returning 37031 
1727204382.81504: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-b754-dfb8-0000000001b7] 37031 1727204382.81602: sending task result for task 0affcd87-79f5-b754-dfb8-0000000001b7 37031 1727204382.81709: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000001b7 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 37031 1727204382.81773: no more pending results, returning what we have 37031 1727204382.81777: results queue empty 37031 1727204382.81778: checking for any_errors_fatal 37031 1727204382.81785: done checking for any_errors_fatal 37031 1727204382.81786: checking for max_fail_percentage 37031 1727204382.81788: done checking for max_fail_percentage 37031 1727204382.81789: checking to see if all hosts have failed and the running result is not ok 37031 1727204382.81790: done checking to see if all hosts have failed 37031 1727204382.81791: getting the remaining hosts for this loop 37031 1727204382.81793: done getting the remaining hosts for this loop 37031 1727204382.81797: getting the next task for host managed-node2 37031 1727204382.81808: done getting next task for host managed-node2 37031 1727204382.81810: ^ task is: TASK: Show current_interfaces 37031 1727204382.81814: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204382.81818: getting variables 37031 1727204382.81819: in VariableManager get_vars() 37031 1727204382.81863: Calling all_inventory to load vars for managed-node2 37031 1727204382.81868: Calling groups_inventory to load vars for managed-node2 37031 1727204382.81870: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204382.81881: Calling all_plugins_play to load vars for managed-node2 37031 1727204382.81884: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204382.81887: Calling groups_plugins_play to load vars for managed-node2 37031 1727204382.82106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204382.82325: done with get_vars() 37031 1727204382.82336: done getting variables 37031 1727204382.82730: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204382.82758: WORKER PROCESS EXITING TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.073) 0:00:05.372 ***** 37031 1727204382.82813: entering _queue_task() for managed-node2/debug 37031 1727204382.83441: worker is 1 (out of 1 available) 37031 1727204382.83453: exiting _queue_task() for managed-node2/debug 37031 1727204382.83469: done queuing things up, now waiting for results queue to drain 37031 1727204382.83470: waiting for pending results... 
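The debug handler queued above produces the `MSG: current_interfaces: [...]` line that appears in the result further down. As a minimal sketch (not Ansible's actual source), the task in `show_interfaces.yml` presumably templates something like `msg: "current_interfaces: {{ current_interfaces }}"`, which reduces to ordinary string formatting of the fact set earlier:

```python
# Hypothetical sketch of what the "Show current_interfaces" debug task renders.
# The fact value is taken from the set_fact result logged above for managed-node2.
current_interfaces = ["bonding_masters", "eth0", "lo"]

# Jinja renders the list with its Python repr, so the message matches the log line
# "MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']".
msg = "current_interfaces: %s" % current_interfaces
```
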
37031 1727204382.84844: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 37031 1727204382.85083: in run() - task 0affcd87-79f5-b754-dfb8-000000000180 37031 1727204382.85087: variable 'ansible_search_path' from source: unknown 37031 1727204382.85089: variable 'ansible_search_path' from source: unknown 37031 1727204382.85092: calling self._execute() 37031 1727204382.85095: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.85097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.85100: variable 'omit' from source: magic vars 37031 1727204382.85733: variable 'ansible_distribution_major_version' from source: facts 37031 1727204382.85745: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204382.85751: variable 'omit' from source: magic vars 37031 1727204382.85799: variable 'omit' from source: magic vars 37031 1727204382.86005: variable 'current_interfaces' from source: set_fact 37031 1727204382.86034: variable 'omit' from source: magic vars 37031 1727204382.86078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204382.86112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204382.86133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204382.86150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.86166: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.86318: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204382.86321: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.86324: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.86424: Set connection var ansible_connection to ssh 37031 1727204382.86427: Set connection var ansible_shell_type to sh 37031 1727204382.86434: Set connection var ansible_pipelining to False 37031 1727204382.86443: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204382.86448: Set connection var ansible_timeout to 10 37031 1727204382.86453: Set connection var ansible_shell_executable to /bin/sh 37031 1727204382.86485: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.86489: variable 'ansible_connection' from source: unknown 37031 1727204382.86491: variable 'ansible_module_compression' from source: unknown 37031 1727204382.86493: variable 'ansible_shell_type' from source: unknown 37031 1727204382.86496: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.86498: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.86500: variable 'ansible_pipelining' from source: unknown 37031 1727204382.86502: variable 'ansible_timeout' from source: unknown 37031 1727204382.86509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.86646: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204382.86658: variable 'omit' from source: magic vars 37031 1727204382.86665: starting attempt loop 37031 1727204382.86668: running the handler 37031 1727204382.86718: handler run complete 37031 1727204382.86731: attempt loop complete, returning result 37031 1727204382.86734: _execute() done 37031 1727204382.86736: dumping result to json 37031 1727204382.86739: done dumping result, returning 37031 1727204382.86746: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-b754-dfb8-000000000180] 37031 1727204382.86749: sending task result for task 0affcd87-79f5-b754-dfb8-000000000180 37031 1727204382.86840: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000180 37031 1727204382.86842: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 37031 1727204382.86889: no more pending results, returning what we have 37031 1727204382.86892: results queue empty 37031 1727204382.86893: checking for any_errors_fatal 37031 1727204382.86899: done checking for any_errors_fatal 37031 1727204382.86899: checking for max_fail_percentage 37031 1727204382.86901: done checking for max_fail_percentage 37031 1727204382.86902: checking to see if all hosts have failed and the running result is not ok 37031 1727204382.86903: done checking to see if all hosts have failed 37031 1727204382.86904: getting the remaining hosts for this loop 37031 1727204382.86905: done getting the remaining hosts for this loop 37031 1727204382.86909: getting the next task for host managed-node2 37031 1727204382.86918: done getting next task for host managed-node2 37031 1727204382.86921: ^ task is: TASK: Install iproute 37031 1727204382.86923: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204382.86928: getting variables 37031 1727204382.86930: in VariableManager get_vars() 37031 1727204382.86975: Calling all_inventory to load vars for managed-node2 37031 1727204382.86978: Calling groups_inventory to load vars for managed-node2 37031 1727204382.86980: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204382.86989: Calling all_plugins_play to load vars for managed-node2 37031 1727204382.86991: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204382.86994: Calling groups_plugins_play to load vars for managed-node2 37031 1727204382.87159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204382.87369: done with get_vars() 37031 1727204382.87381: done getting variables 37031 1727204382.87445: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:59:42 -0400 (0:00:00.046) 0:00:05.419 ***** 37031 1727204382.87477: entering _queue_task() for managed-node2/package 37031 1727204382.88426: worker is 1 (out of 1 available) 37031 1727204382.88440: exiting _queue_task() for managed-node2/package 37031 1727204382.88453: done queuing things up, now waiting for results queue to drain 37031 1727204382.88454: waiting for pending results... 
37031 1727204382.89261: running TaskExecutor() for managed-node2/TASK: Install iproute 37031 1727204382.89457: in run() - task 0affcd87-79f5-b754-dfb8-000000000159 37031 1727204382.89473: variable 'ansible_search_path' from source: unknown 37031 1727204382.89476: variable 'ansible_search_path' from source: unknown 37031 1727204382.89511: calling self._execute() 37031 1727204382.89812: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.89815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.89825: variable 'omit' from source: magic vars 37031 1727204382.92261: variable 'ansible_distribution_major_version' from source: facts 37031 1727204382.92274: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204382.92280: variable 'omit' from source: magic vars 37031 1727204382.92320: variable 'omit' from source: magic vars 37031 1727204382.92512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204382.97019: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204382.97097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204382.97135: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204382.97173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204382.97201: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204382.97299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204382.97327: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204382.97351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204382.97396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204382.97409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204382.97723: variable '__network_is_ostree' from source: set_fact 37031 1727204382.97727: variable 'omit' from source: magic vars 37031 1727204382.97759: variable 'omit' from source: magic vars 37031 1727204382.97790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204382.97814: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204382.97831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204382.97848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.97861: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204382.97988: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204382.97991: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.97994: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 37031 1727204382.98093: Set connection var ansible_connection to ssh 37031 1727204382.98096: Set connection var ansible_shell_type to sh 37031 1727204382.98102: Set connection var ansible_pipelining to False 37031 1727204382.98111: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204382.98116: Set connection var ansible_timeout to 10 37031 1727204382.98122: Set connection var ansible_shell_executable to /bin/sh 37031 1727204382.98149: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.98152: variable 'ansible_connection' from source: unknown 37031 1727204382.98154: variable 'ansible_module_compression' from source: unknown 37031 1727204382.98161: variable 'ansible_shell_type' from source: unknown 37031 1727204382.98163: variable 'ansible_shell_executable' from source: unknown 37031 1727204382.98166: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204382.98172: variable 'ansible_pipelining' from source: unknown 37031 1727204382.98175: variable 'ansible_timeout' from source: unknown 37031 1727204382.98182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204382.98268: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204382.98279: variable 'omit' from source: magic vars 37031 1727204382.98290: starting attempt loop 37031 1727204382.98293: running the handler 37031 1727204382.98299: variable 'ansible_facts' from source: unknown 37031 1727204382.98302: variable 'ansible_facts' from source: unknown 37031 1727204382.98336: _low_level_execute_command(): starting 37031 1727204382.98343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 
1727204383.02333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204383.02390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.02406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.02423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.02468: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.02482: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204383.02494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.02509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204383.02603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204383.02616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204383.02628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.02641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.02656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.02670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.02681: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204383.02695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.02775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204383.02825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204383.02840: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204383.03039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204383.04671: stdout chunk (state=3): >>>/root <<< 37031 1727204383.04862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204383.04868: stdout chunk (state=3): >>><<< 37031 1727204383.04870: stderr chunk (state=3): >>><<< 37031 1727204383.04983: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204383.04986: _low_level_execute_command(): starting 37031 1727204383.04990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431 `" && echo 
ansible-tmp-1727204383.0489204-37414-103688801790431="` echo /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431 `" ) && sleep 0' 37031 1727204383.06406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.06410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.06442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.06445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.06447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.06711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204383.06897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204383.06900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204383.06969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204383.08890: stdout chunk (state=3): >>>ansible-tmp-1727204383.0489204-37414-103688801790431=/root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431 <<< 37031 1727204383.09088: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 37031 1727204383.09093: stdout chunk (state=3): >>><<< 37031 1727204383.09097: stderr chunk (state=3): >>><<< 37031 1727204383.09121: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204383.0489204-37414-103688801790431=/root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204383.09155: variable 'ansible_module_compression' from source: unknown 37031 1727204383.09227: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 37031 1727204383.09231: ANSIBALLZ: Acquiring lock 37031 1727204383.09234: ANSIBALLZ: Lock acquired: 140694173153808 37031 1727204383.09236: ANSIBALLZ: Creating module 37031 1727204383.39126: ANSIBALLZ: Writing module into payload 37031 1727204383.39753: ANSIBALLZ: Writing module 37031 1727204383.39792: ANSIBALLZ: Renaming module 37031 1727204383.39815: ANSIBALLZ: Done 
creating module 37031 1727204383.39844: variable 'ansible_facts' from source: unknown 37031 1727204383.39938: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/AnsiballZ_dnf.py 37031 1727204383.40095: Sending initial data 37031 1727204383.40098: Sent initial data (152 bytes) 37031 1727204383.41894: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204383.41930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.41939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.41953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.42070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.42087: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204383.42099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.42112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204383.42119: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204383.42134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204383.42137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.42146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.42162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.42171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.42178: stderr chunk (state=3): >>>debug2: match found <<< 37031 
1727204383.42186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.42378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204383.42397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204383.42408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204383.42486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204383.44311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204383.44347: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204383.44393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp04s6gl2t /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/AnsiballZ_dnf.py <<< 37031 1727204383.44427: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204383.45984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204383.45988: stderr chunk (state=3): >>><<< 37031 1727204383.45995: stdout chunk (state=3): >>><<< 37031 1727204383.46017: done transferring module to remote 37031 1727204383.46028: _low_level_execute_command(): starting 
37031 1727204383.46034: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/ /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/AnsiballZ_dnf.py && sleep 0' 37031 1727204383.46684: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204383.46692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.46702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.46714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.46752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.46762: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204383.46774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.46787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204383.46793: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204383.46799: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204383.46807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.46815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.46825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.46832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.46839: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204383.46847: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.46921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204383.46934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204383.46944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204383.47009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204383.48774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204383.48855: stderr chunk (state=3): >>><<< 37031 1727204383.48860: stdout chunk (state=3): >>><<< 37031 1727204383.48882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204383.48885: _low_level_execute_command(): starting 37031 1727204383.48890: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/AnsiballZ_dnf.py && sleep 0' 37031 1727204383.49527: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204383.49534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.49545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.49566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.49604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.49611: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204383.49623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.49636: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204383.49642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204383.49649: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204383.49660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204383.49673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204383.49685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204383.49693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204383.49699: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204383.49708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204383.49783: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 37031 1727204383.49797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204383.49807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204383.49886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.43112: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 37031 1727204384.47274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204384.47329: stderr chunk (state=3): >>><<< 37031 1727204384.47333: stdout chunk (state=3): >>><<< 37031 1727204384.47348: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204384.47387: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204384.47394: _low_level_execute_command(): starting 37031 1727204384.47400: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204383.0489204-37414-103688801790431/ > /dev/null 2>&1 && sleep 0' 37031 1727204384.47873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.47881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.47911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.47923: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.47981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.47993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.48042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.49927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.49957: stderr chunk (state=3): >>><<< 37031 1727204384.49961: stdout chunk (state=3): >>><<< 37031 1727204384.50372: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.50377: handler run complete 37031 1727204384.50380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204384.50383: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204384.50413: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204384.50452: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204384.50496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204384.50581: variable '__install_status' from source: unknown 37031 1727204384.50612: Evaluated conditional (__install_status is success): True 37031 1727204384.50634: attempt loop complete, returning result 37031 1727204384.50642: _execute() done 37031 1727204384.50656: dumping result to json 37031 1727204384.50670: done dumping result, returning 37031 1727204384.50683: done running TaskExecutor() for managed-node2/TASK: Install iproute [0affcd87-79f5-b754-dfb8-000000000159] 37031 1727204384.50694: sending task result for task 0affcd87-79f5-b754-dfb8-000000000159 37031 1727204384.50823: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000159 37031 1727204384.50833: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 37031 1727204384.50978: no more pending results, returning what we have 37031 1727204384.50981: results queue empty 37031 1727204384.50981: checking for any_errors_fatal 37031 1727204384.50984: done checking for any_errors_fatal 37031 1727204384.50985: checking for max_fail_percentage 37031 1727204384.50986: done checking for max_fail_percentage 37031 1727204384.50987: checking to see if all 
hosts have failed and the running result is not ok 37031 1727204384.50988: done checking to see if all hosts have failed 37031 1727204384.50988: getting the remaining hosts for this loop 37031 1727204384.50990: done getting the remaining hosts for this loop 37031 1727204384.50994: getting the next task for host managed-node2 37031 1727204384.50999: done getting next task for host managed-node2 37031 1727204384.51002: ^ task is: TASK: Create veth interface {{ interface }} 37031 1727204384.51004: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204384.51007: getting variables 37031 1727204384.51008: in VariableManager get_vars() 37031 1727204384.51040: Calling all_inventory to load vars for managed-node2 37031 1727204384.51042: Calling groups_inventory to load vars for managed-node2 37031 1727204384.51044: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204384.51054: Calling all_plugins_play to load vars for managed-node2 37031 1727204384.51056: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204384.51059: Calling groups_plugins_play to load vars for managed-node2 37031 1727204384.51218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204384.51433: done with get_vars() 37031 1727204384.51445: done getting variables 37031 1727204384.51512: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204384.51651: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:59:44 -0400 (0:00:01.642) 0:00:07.061 ***** 37031 1727204384.51697: entering _queue_task() for managed-node2/command 37031 1727204384.52004: worker is 1 (out of 1 available) 37031 1727204384.52017: exiting _queue_task() for managed-node2/command 37031 1727204384.52030: done queuing things up, now waiting for results queue to drain 37031 1727204384.52033: waiting for pending results... 
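The epoch-seconds prefix on each debug entry (e.g. `1727204384.51697`) and the wall-clock banner in the task header (`Tuesday 24 September 2024 14:59:44 -0400`) describe the same instant; a quick sanity check, using a timestamp copied from the log above:

```python
from datetime import datetime, timezone

# Epoch-seconds prefix copied from a debug entry next to the task banner
ts = 1727204384.51697

# 18:59:44 UTC is 14:59:44 at UTC-4, matching the banner in the log
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
print(utc.strftime("%Y-%m-%d %H:%M:%S"))  # 2024-09-24 18:59:44
```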
37031 1727204384.52427: running TaskExecutor() for managed-node2/TASK: Create veth interface veth0 37031 1727204384.52433: in run() - task 0affcd87-79f5-b754-dfb8-00000000015a 37031 1727204384.52436: variable 'ansible_search_path' from source: unknown 37031 1727204384.52439: variable 'ansible_search_path' from source: unknown 37031 1727204384.52724: variable 'interface' from source: play vars 37031 1727204384.52804: variable 'interface' from source: play vars 37031 1727204384.52885: variable 'interface' from source: play vars 37031 1727204384.53047: Loaded config def from plugin (lookup/items) 37031 1727204384.53058: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 37031 1727204384.53078: variable 'omit' from source: magic vars 37031 1727204384.53198: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.53218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.53229: variable 'omit' from source: magic vars 37031 1727204384.53459: variable 'ansible_distribution_major_version' from source: facts 37031 1727204384.53471: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204384.53673: variable 'type' from source: play vars 37031 1727204384.53677: variable 'state' from source: include params 37031 1727204384.53689: variable 'interface' from source: play vars 37031 1727204384.53698: variable 'current_interfaces' from source: set_fact 37031 1727204384.53705: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 37031 1727204384.53711: variable 'omit' from source: magic vars 37031 1727204384.53747: variable 'omit' from source: magic vars 37031 1727204384.53780: variable 'item' from source: unknown 37031 1727204384.53834: variable 'item' from source: unknown 37031 1727204384.53846: variable 'omit' from source: magic vars 37031 1727204384.53875: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204384.53897: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204384.53911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204384.53950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204384.53967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204384.53994: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204384.53997: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.54001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.54074: Set connection var ansible_connection to ssh 37031 1727204384.54079: Set connection var ansible_shell_type to sh 37031 1727204384.54084: Set connection var ansible_pipelining to False 37031 1727204384.54091: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204384.54097: Set connection var ansible_timeout to 10 37031 1727204384.54102: Set connection var ansible_shell_executable to /bin/sh 37031 1727204384.54120: variable 'ansible_shell_executable' from source: unknown 37031 1727204384.54123: variable 'ansible_connection' from source: unknown 37031 1727204384.54126: variable 'ansible_module_compression' from source: unknown 37031 1727204384.54128: variable 'ansible_shell_type' from source: unknown 37031 1727204384.54131: variable 'ansible_shell_executable' from source: unknown 37031 1727204384.54134: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.54136: variable 'ansible_pipelining' from source: unknown 37031 1727204384.54141: variable 'ansible_timeout' from 
source: unknown 37031 1727204384.54143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.54238: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204384.54250: variable 'omit' from source: magic vars 37031 1727204384.54256: starting attempt loop 37031 1727204384.54259: running the handler 37031 1727204384.54268: _low_level_execute_command(): starting 37031 1727204384.54276: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204384.54808: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.54839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.54857: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204384.54871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.54917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 
1727204384.54920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.54930: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.54984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.56581: stdout chunk (state=3): >>>/root <<< 37031 1727204384.56682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.56717: stderr chunk (state=3): >>><<< 37031 1727204384.56729: stdout chunk (state=3): >>><<< 37031 1727204384.56755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.56776: _low_level_execute_command(): starting 37031 1727204384.56796: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845 `" && echo ansible-tmp-1727204384.5676198-37650-64882985681845="` echo /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845 `" ) && sleep 0' 37031 1727204384.57333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204384.57344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.57354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.57380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.57422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204384.57426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.57428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204384.57430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.57490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.57497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.57537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.59381: stdout chunk (state=3): 
>>>ansible-tmp-1727204384.5676198-37650-64882985681845=/root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845 <<< 37031 1727204384.59499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.59558: stderr chunk (state=3): >>><<< 37031 1727204384.59561: stdout chunk (state=3): >>><<< 37031 1727204384.59577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204384.5676198-37650-64882985681845=/root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.59607: variable 'ansible_module_compression' from source: unknown 37031 1727204384.59652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204384.59682: variable 'ansible_facts' from source: unknown 37031 
1727204384.59746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/AnsiballZ_command.py 37031 1727204384.59859: Sending initial data 37031 1727204384.59862: Sent initial data (155 bytes) 37031 1727204384.60580: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.60587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.60622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.60627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204384.60633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.60642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.60647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.60658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.60661: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204384.60672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.60728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.60734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.60745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 
1727204384.60799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.62519: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204384.62555: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204384.62590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpkzryirl5 /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/AnsiballZ_command.py <<< 37031 1727204384.62626: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204384.63900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.63993: stderr chunk (state=3): >>><<< 37031 1727204384.63999: stdout chunk (state=3): >>><<< 37031 1727204384.64023: done transferring module to remote 37031 1727204384.64035: _low_level_execute_command(): starting 37031 1727204384.64040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/ /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/AnsiballZ_command.py && sleep 0' 37031 1727204384.65188: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 
1727204384.65197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.65240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.65243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.65259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.65268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.65349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.65369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.65432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.67438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.67522: stderr chunk (state=3): >>><<< 37031 1727204384.67525: stdout chunk (state=3): >>><<< 37031 1727204384.67548: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.67551: _low_level_execute_command(): starting 37031 1727204384.67559: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/AnsiballZ_command.py && sleep 0' 37031 1727204384.68343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204384.68356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.68360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.68377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.68416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.68423: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204384.68433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.68446: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass <<< 37031 1727204384.68457: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204384.68461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204384.68474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.68484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.68495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.68503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.68509: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204384.68518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.68592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.68607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.68610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.68694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.82910: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:59:44.817944", "end": "2024-09-24 14:59:44.828019", "delta": "0:00:00.010075", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204384.85227: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204384.85232: stderr chunk (state=3): >>><<< 37031 1727204384.85235: stdout chunk (state=3): >>><<< 37031 1727204384.85304: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-24 14:59:44.817944", "end": "2024-09-24 14:59:44.828019", "delta": "0:00:00.010075", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
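The first command's module result above is a single JSON object emitted on the remote Python's stdout, which `_low_level_execute_command()` hands back verbatim. A minimal sketch (assuming only the payload shape visible in the log, abridged to the fields used here) of pulling out the values the callback later displays:

```python
import json

# Raw module result as it appears on stdout in the log above, abridged to the
# fields used below (the full payload also carries an "invocation" section).
payload = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"],
 "start": "2024-09-24 14:59:44.817944", "end": "2024-09-24 14:59:44.828019",
 "delta": "0:00:00.010075", "msg": ""}'''

result = json.loads(payload)

# The summary keys shown later by the callback come straight from this dict.
print(result["rc"])             # 0
print(" ".join(result["cmd"]))  # the ip(8) invocation that created the veth pair
```

Note the `rc`, `start`, `end`, and `delta` values here reappear unchanged in the `ok: [managed-node2] => …` callback output further down the log.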
37031 1727204384.85309: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204384.85316: _low_level_execute_command(): starting 37031 1727204384.85322: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204384.5676198-37650-64882985681845/ > /dev/null 2>&1 && sleep 0' 37031 1727204384.87406: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204384.87414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.87425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.87440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.87488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.87578: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204384.87588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.87602: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204384.87609: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204384.87617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204384.87624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.87634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.87646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.87656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.87659: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204384.87671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.87746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.87803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.87808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.87968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.89938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.89942: stdout chunk (state=3): >>><<< 37031 1727204384.89950: stderr chunk (state=3): >>><<< 37031 1727204384.89990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.89996: handler run complete 37031 1727204384.90018: Evaluated conditional (False): False 37031 1727204384.90028: attempt loop complete, returning result 37031 1727204384.90050: variable 'item' from source: unknown 37031 1727204384.90132: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.010075", "end": "2024-09-24 14:59:44.828019", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-24 14:59:44.817944" } 37031 1727204384.90304: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.90308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.90311: variable 'omit' from source: magic vars 37031 1727204384.90425: variable 'ansible_distribution_major_version' from source: facts 37031 1727204384.90438: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204384.90691: variable 'type' from source: play vars 37031 1727204384.90695: variable 'state' from source: include params 37031 1727204384.90699: variable 'interface' from 
source: play vars 37031 1727204384.90762: variable 'current_interfaces' from source: set_fact 37031 1727204384.90771: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 37031 1727204384.90776: variable 'omit' from source: magic vars 37031 1727204384.90791: variable 'omit' from source: magic vars 37031 1727204384.90900: variable 'item' from source: unknown 37031 1727204384.91026: variable 'item' from source: unknown 37031 1727204384.91107: variable 'omit' from source: magic vars 37031 1727204384.91128: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204384.91269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204384.91272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204384.91275: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204384.91277: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.91279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.91474: Set connection var ansible_connection to ssh 37031 1727204384.91477: Set connection var ansible_shell_type to sh 37031 1727204384.91483: Set connection var ansible_pipelining to False 37031 1727204384.91491: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204384.91496: Set connection var ansible_timeout to 10 37031 1727204384.91501: Set connection var ansible_shell_executable to /bin/sh 37031 1727204384.91640: variable 'ansible_shell_executable' from source: unknown 37031 1727204384.91643: variable 'ansible_connection' from source: unknown 37031 1727204384.91646: variable 'ansible_module_compression' from source: unknown 
37031 1727204384.91648: variable 'ansible_shell_type' from source: unknown 37031 1727204384.91650: variable 'ansible_shell_executable' from source: unknown 37031 1727204384.91652: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204384.91657: variable 'ansible_pipelining' from source: unknown 37031 1727204384.91659: variable 'ansible_timeout' from source: unknown 37031 1727204384.91662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204384.91879: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204384.91888: variable 'omit' from source: magic vars 37031 1727204384.91893: starting attempt loop 37031 1727204384.91901: running the handler 37031 1727204384.91909: _low_level_execute_command(): starting 37031 1727204384.91912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204384.94439: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.94443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.94641: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.94646: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 37031 1727204384.94663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204384.94671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.94748: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.94774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.94790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.94930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204384.96587: stdout chunk (state=3): >>>/root <<< 37031 1727204384.96883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204384.96886: stdout chunk (state=3): >>><<< 37031 1727204384.96889: stderr chunk (state=3): >>><<< 37031 1727204384.96990: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204384.96994: _low_level_execute_command(): starting 37031 1727204384.96996: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868 `" && echo ansible-tmp-1727204384.969059-37650-16779676910868="` echo /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868 `" ) && sleep 0' 37031 1727204384.98626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204384.98637: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.98648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204384.98662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.98726: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.98817: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204384.98827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.98842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204384.98849: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204384.98858: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204384.98866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204384.98877: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 37031 1727204384.98888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204384.98896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204384.98904: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204384.98917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204384.98991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204384.99146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204384.99158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204384.99249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.01113: stdout chunk (state=3): >>>ansible-tmp-1727204384.969059-37650-16779676910868=/root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868 <<< 37031 1727204385.01297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.01301: stdout chunk (state=3): >>><<< 37031 1727204385.01307: stderr chunk (state=3): >>><<< 37031 1727204385.01328: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204384.969059-37650-16779676910868=/root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.01359: variable 'ansible_module_compression' from source: unknown 37031 1727204385.01401: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204385.01423: variable 'ansible_facts' from source: unknown 37031 1727204385.01488: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/AnsiballZ_command.py 37031 1727204385.01943: Sending initial data 37031 1727204385.01946: Sent initial data (154 bytes) 37031 1727204385.02983: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.02997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.03013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.03037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.03126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.03141: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.03161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.03182: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.03194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.03205: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.03217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.03232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.03247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.03267: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.03278: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.03290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.03372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.03390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.03404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.03617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.05243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 
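Every SSH invocation in this log reports `auto-mux: Trying existing master` and a reused `master session id: 2`: Ansible's ssh connection plugin keeps one persistent master connection per host and multiplexes each command and SFTP transfer over it. A hedged sketch of the kind of `-o` multiplexing options the plugin passes on the command line (the values below are illustrative assumptions, not a dump of this run's actual arguments):

```python
# Illustrative (assumed) multiplexing options of the kind Ansible's ssh plugin
# adds to each invocation, producing the "auto-mux: Trying existing master"
# reuse seen in the stderr chunks above.
host = "10.31.13.78"
mux_opts = [
    "-o", "ControlMaster=auto",          # create a master if none exists, else reuse
    "-o", "ControlPersist=60s",          # keep the master alive between tasks
    "-o", "ControlPath=/tmp/ssh-cp-%r@%h:%p",  # assumed socket path template
]
cmd = ["ssh", "-vv", *mux_opts, host, "/bin/sh -c 'echo ~ && sleep 0'"]
print(" ".join(cmd))
```

This is why only the master connection pays the TCP/key-exchange cost; the per-task sessions in the log go straight to `mux_client_request_session`.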
37031 1727204385.05278: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204385.05313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpzv1vkai4 /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/AnsiballZ_command.py <<< 37031 1727204385.05349: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204385.06673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.06678: stderr chunk (state=3): >>><<< 37031 1727204385.06680: stdout chunk (state=3): >>><<< 37031 1727204385.06770: done transferring module to remote 37031 1727204385.06773: _low_level_execute_command(): starting 37031 1727204385.06776: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/ /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/AnsiballZ_command.py && sleep 0' 37031 1727204385.08334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.08402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.08415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.08430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.08537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.08544: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.08610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.08624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 
1727204385.08641: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.08648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.08659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.08672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.08683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.08691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.08697: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.08709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.08791: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.08946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.08968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.09037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.10830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.10835: stdout chunk (state=3): >>><<< 37031 1727204385.10841: stderr chunk (state=3): >>><<< 37031 1727204385.10868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.10872: _low_level_execute_command(): starting 37031 1727204385.10876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/AnsiballZ_command.py && sleep 0' 37031 1727204385.12701: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.12797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.12807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.12834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.12880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.12938: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.12949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.12970: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.13004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 
1727204385.13011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.13020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.13028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.13044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.13074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.13081: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.13091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.13227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.13276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.13293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.13378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.26799: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:59:45.263535", "end": "2024-09-24 14:59:45.267211", "delta": "0:00:00.003676", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204385.28588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204385.28593: stdout chunk (state=3): >>><<< 37031 1727204385.28598: stderr chunk (state=3): >>><<< 37031 1727204385.29098: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-24 14:59:45.263535", "end": "2024-09-24 14:59:45.267211", "delta": "0:00:00.003676", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
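The second command's result reports `"start"`, `"end"`, and a `"delta"` of `0:00:00.003676`; the delta is simply the wall-clock difference of the two timestamps. A small check using the values copied from the log above:

```python
from datetime import datetime

# Timestamps copied verbatim from the "ip link set peerveth0 up" result above.
fmt = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2024-09-24 14:59:45.263535", fmt)
end = datetime.strptime("2024-09-24 14:59:45.267211", fmt)

delta = end - start
# Agrees with the module's reported "delta": "0:00:00.003676"
print(delta.total_seconds())  # 0.003676
```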
37031 1727204385.29134: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204385.29138: _low_level_execute_command(): starting 37031 1727204385.29143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204384.969059-37650-16779676910868/ > /dev/null 2>&1 && sleep 0' 37031 1727204385.30040: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.30048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.30062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.30081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.30130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.30136: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.30146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.30163: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.30173: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.30180: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.30188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.30196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.30207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.30216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.30230: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.30268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.30345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.30362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.30377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.30478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.32314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.32318: stdout chunk (state=3): >>><<< 37031 1727204385.32324: stderr chunk (state=3): >>><<< 37031 1727204385.32341: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.32347: handler run complete 37031 1727204385.32377: Evaluated conditional (False): False 37031 1727204385.32386: attempt loop complete, returning result 37031 1727204385.32404: variable 'item' from source: unknown 37031 1727204385.32496: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003676", "end": "2024-09-24 14:59:45.267211", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-24 14:59:45.263535" } 37031 1727204385.32632: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204385.32635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.32638: variable 'omit' from source: magic vars 37031 1727204385.32811: variable 'ansible_distribution_major_version' from source: facts 37031 1727204385.32814: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204385.32953: variable 'type' from source: play vars 37031 1727204385.32959: variable 'state' from source: include params 37031 1727204385.32968: variable 'interface' from source: play vars 37031 1727204385.32981: variable 'current_interfaces' from source: set_fact 37031 
1727204385.32987: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 37031 1727204385.32991: variable 'omit' from source: magic vars 37031 1727204385.33002: variable 'omit' from source: magic vars 37031 1727204385.33029: variable 'item' from source: unknown 37031 1727204385.33114: variable 'item' from source: unknown 37031 1727204385.33132: variable 'omit' from source: magic vars 37031 1727204385.33263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204385.33269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204385.33272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204385.33274: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204385.33276: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204385.33278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.33307: Set connection var ansible_connection to ssh 37031 1727204385.33315: Set connection var ansible_shell_type to sh 37031 1727204385.33323: Set connection var ansible_pipelining to False 37031 1727204385.33329: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204385.33334: Set connection var ansible_timeout to 10 37031 1727204385.33340: Set connection var ansible_shell_executable to /bin/sh 37031 1727204385.33367: variable 'ansible_shell_executable' from source: unknown 37031 1727204385.33370: variable 'ansible_connection' from source: unknown 37031 1727204385.33373: variable 'ansible_module_compression' from source: unknown 37031 1727204385.33377: variable 'ansible_shell_type' from source: unknown 37031 1727204385.33380: 
variable 'ansible_shell_executable' from source: unknown 37031 1727204385.33382: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204385.33384: variable 'ansible_pipelining' from source: unknown 37031 1727204385.33386: variable 'ansible_timeout' from source: unknown 37031 1727204385.33388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.33493: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204385.33504: variable 'omit' from source: magic vars 37031 1727204385.33506: starting attempt loop 37031 1727204385.33509: running the handler 37031 1727204385.33513: _low_level_execute_command(): starting 37031 1727204385.33516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204385.34250: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.34261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.34272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.34309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.34331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.34336: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.34349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.34354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration <<< 37031 1727204385.34363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.34370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.34379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.34382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204385.34391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.34447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.34492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.34529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.36076: stdout chunk (state=3): >>>/root <<< 37031 1727204385.36224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.36228: stdout chunk (state=3): >>><<< 37031 1727204385.36237: stderr chunk (state=3): >>><<< 37031 1727204385.36248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.36258: _low_level_execute_command(): starting 37031 1727204385.36261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240 `" && echo ansible-tmp-1727204385.3624716-37650-107728322890240="` echo /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240 `" ) && sleep 0' 37031 1727204385.36714: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.36723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.36728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.36740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.36774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.36779: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.36789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.36799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.36804: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 37031 1727204385.36813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.36823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.36830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.36836: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.36841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.36893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.36913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.36962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.38790: stdout chunk (state=3): >>>ansible-tmp-1727204385.3624716-37650-107728322890240=/root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240 <<< 37031 1727204385.38899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.38960: stderr chunk (state=3): >>><<< 37031 1727204385.38971: stdout chunk (state=3): >>><<< 37031 1727204385.38988: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204385.3624716-37650-107728322890240=/root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.39009: variable 'ansible_module_compression' from source: unknown 37031 1727204385.39038: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204385.39052: variable 'ansible_facts' from source: unknown 37031 1727204385.39101: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/AnsiballZ_command.py 37031 1727204385.39194: Sending initial data 37031 1727204385.39197: Sent initial data (156 bytes) 37031 1727204385.39886: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.39889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.39899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.39929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.39935: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.39944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 
1727204385.39953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.39963: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.39971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.39977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204385.39984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.40034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.40049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.40052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.40120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.41823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204385.41868: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 
1727204385.41912: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpoamcfl2p /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/AnsiballZ_command.py <<< 37031 1727204385.42109: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204385.42817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.42930: stderr chunk (state=3): >>><<< 37031 1727204385.42933: stdout chunk (state=3): >>><<< 37031 1727204385.42950: done transferring module to remote 37031 1727204385.42960: _low_level_execute_command(): starting 37031 1727204385.42967: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/ /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/AnsiballZ_command.py && sleep 0' 37031 1727204385.43419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.43423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.43477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.43481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.43484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match found <<< 37031 1727204385.43487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.43536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.43540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.43550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.43602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.45294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.45342: stderr chunk (state=3): >>><<< 37031 1727204385.45345: stdout chunk (state=3): >>><<< 37031 1727204385.45359: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 
1727204385.45362: _low_level_execute_command(): starting 37031 1727204385.45368: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/AnsiballZ_command.py && sleep 0' 37031 1727204385.45803: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.45807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.45839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.45850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.45906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.45918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.45971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.59707: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 14:59:45.589061", "end": "2024-09-24 14:59:45.595148", "delta": "0:00:00.006087", "msg": "", 
"invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204385.60761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204385.60879: stderr chunk (state=3): >>><<< 37031 1727204385.60886: stdout chunk (state=3): >>><<< 37031 1727204385.60906: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-24 14:59:45.589061", "end": "2024-09-24 14:59:45.595148", "delta": "0:00:00.006087", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204385.60938: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204385.60945: _low_level_execute_command(): starting 37031 1727204385.60948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204385.3624716-37650-107728322890240/ > /dev/null 2>&1 && sleep 0' 37031 1727204385.61429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.61435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.61488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.61491: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204385.61493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.61495: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.61553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.61556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.61558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.61602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.63384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.63468: stderr chunk (state=3): >>><<< 37031 1727204385.63475: stdout chunk (state=3): >>><<< 37031 1727204385.63495: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.63500: handler run complete 37031 1727204385.63523: Evaluated conditional (False): False 37031 1727204385.63534: attempt loop complete, returning result 37031 1727204385.63566: variable 'item' from source: unknown 37031 1727204385.63646: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.006087", "end": "2024-09-24 14:59:45.595148", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-24 14:59:45.589061" } 37031 1727204385.63778: dumping result to json 37031 1727204385.63782: done dumping result, returning 37031 1727204385.63785: done running TaskExecutor() for managed-node2/TASK: Create veth interface veth0 [0affcd87-79f5-b754-dfb8-00000000015a] 37031 1727204385.63787: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015a 37031 1727204385.63838: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015a 37031 1727204385.63841: WORKER PROCESS EXITING 37031 1727204385.63974: no more pending results, returning what we have 37031 1727204385.63979: results queue empty 37031 1727204385.63980: checking for any_errors_fatal 37031 1727204385.63985: done checking for any_errors_fatal 37031 1727204385.63986: checking for max_fail_percentage 37031 1727204385.63988: done checking for max_fail_percentage 37031 1727204385.63989: checking to see if all hosts have failed and the running result is not ok 37031 1727204385.63989: done checking to see if 
all hosts have failed 37031 1727204385.63990: getting the remaining hosts for this loop 37031 1727204385.63992: done getting the remaining hosts for this loop 37031 1727204385.63996: getting the next task for host managed-node2 37031 1727204385.64002: done getting next task for host managed-node2 37031 1727204385.64005: ^ task is: TASK: Set up veth as managed by NetworkManager 37031 1727204385.64008: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204385.64012: getting variables 37031 1727204385.64014: in VariableManager get_vars() 37031 1727204385.64059: Calling all_inventory to load vars for managed-node2 37031 1727204385.64062: Calling groups_inventory to load vars for managed-node2 37031 1727204385.64067: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204385.64078: Calling all_plugins_play to load vars for managed-node2 37031 1727204385.64081: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204385.64084: Calling groups_plugins_play to load vars for managed-node2 37031 1727204385.64293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204385.64511: done with get_vars() 37031 1727204385.64523: done getting variables 37031 1727204385.64702: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:59:45 -0400 (0:00:01.130) 0:00:08.192 ***** 37031 1727204385.64732: entering _queue_task() for managed-node2/command 37031 1727204385.65128: worker is 1 (out of 1 available) 37031 1727204385.65145: exiting _queue_task() for managed-node2/command 37031 1727204385.65161: done queuing things up, now waiting for results queue to drain 37031 1727204385.65162: waiting for pending results... 37031 1727204385.65418: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 37031 1727204385.65509: in run() - task 0affcd87-79f5-b754-dfb8-00000000015b 37031 1727204385.65521: variable 'ansible_search_path' from source: unknown 37031 1727204385.65525: variable 'ansible_search_path' from source: unknown 37031 1727204385.65563: calling self._execute() 37031 1727204385.65647: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204385.65651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.65663: variable 'omit' from source: magic vars 37031 1727204385.66040: variable 'ansible_distribution_major_version' from source: facts 37031 1727204385.66053: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204385.66227: variable 'type' from source: play vars 37031 1727204385.66237: variable 'state' from source: include params 37031 1727204385.66243: Evaluated conditional (type == 'veth' and state == 'present'): True 37031 1727204385.66254: variable 'omit' from source: magic vars 37031 1727204385.66298: variable 'omit' from source: magic vars 
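For context, the "Create veth interface veth0" task whose result appears above loops over plain iproute2 commands on the managed node (the log shows its final item, `ip link set veth0 up`). A minimal sketch of that sequence follows; the peer interface name is an assumption, since it is not visible in this chunk, and the commands are echoed rather than executed because they require root / CAP_NET_ADMIN:

```shell
#!/bin/sh
set -eu
veth=veth0          # interface name taken from the task result above
peer=peerveth0      # assumed peer name; not shown in this log chunk
for cmd in \
  "ip link add $veth type veth peer name $peer" \
  "ip link set $peer up" \
  "ip link set $veth up"; do
  # echo instead of executing: these need root / CAP_NET_ADMIN
  echo "+ $cmd"
done
```

On the target this runs through the same `_low_level_execute_command()` machinery traced below, one SSH round-trip per loop item.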
37031 1727204385.66402: variable 'interface' from source: play vars 37031 1727204385.66418: variable 'omit' from source: magic vars 37031 1727204385.66471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204385.66507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204385.66528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204385.66545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204385.66565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204385.66601: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204385.66604: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204385.66607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.66712: Set connection var ansible_connection to ssh 37031 1727204385.66716: Set connection var ansible_shell_type to sh 37031 1727204385.66723: Set connection var ansible_pipelining to False 37031 1727204385.66731: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204385.66737: Set connection var ansible_timeout to 10 37031 1727204385.66743: Set connection var ansible_shell_executable to /bin/sh 37031 1727204385.66776: variable 'ansible_shell_executable' from source: unknown 37031 1727204385.66782: variable 'ansible_connection' from source: unknown 37031 1727204385.66784: variable 'ansible_module_compression' from source: unknown 37031 1727204385.66787: variable 'ansible_shell_type' from source: unknown 37031 1727204385.66791: variable 'ansible_shell_executable' from source: unknown 37031 1727204385.66799: variable 'ansible_host' from source: 
host vars for 'managed-node2' 37031 1727204385.66803: variable 'ansible_pipelining' from source: unknown 37031 1727204385.66806: variable 'ansible_timeout' from source: unknown 37031 1727204385.66810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204385.66958: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204385.66971: variable 'omit' from source: magic vars 37031 1727204385.66976: starting attempt loop 37031 1727204385.66978: running the handler 37031 1727204385.66992: _low_level_execute_command(): starting 37031 1727204385.67004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204385.67834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.67848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.67868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.67885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.67929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.67936: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.67946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.67965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.67974: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.67984: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 37031 1727204385.67994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.68008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.68020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.68027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.68035: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.68044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.68128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.68148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.68167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.68241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.69798: stdout chunk (state=3): >>>/root <<< 37031 1727204385.69978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.69981: stdout chunk (state=3): >>><<< 37031 1727204385.69991: stderr chunk (state=3): >>><<< 37031 1727204385.70015: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.70031: _low_level_execute_command(): starting 37031 1727204385.70045: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401 `" && echo ansible-tmp-1727204385.7001626-37757-227433345490401="` echo /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401 `" ) && sleep 0' 37031 1727204385.70711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.70721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.70731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.70745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.70791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.70798: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.70809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.70821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.70828: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.70835: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.70843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.70851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.70867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.70875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.70882: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.70892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.70975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.70983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.70986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.71060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.72919: stdout chunk (state=3): >>>ansible-tmp-1727204385.7001626-37757-227433345490401=/root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401 <<< 37031 1727204385.73113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.73117: stderr chunk (state=3): >>><<< 37031 1727204385.73119: stdout chunk (state=3): >>><<< 37031 1727204385.73141: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204385.7001626-37757-227433345490401=/root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.73184: variable 'ansible_module_compression' from source: unknown 37031 1727204385.73233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204385.73273: variable 'ansible_facts' from source: unknown 37031 1727204385.73354: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/AnsiballZ_command.py 37031 1727204385.73508: Sending initial data 37031 1727204385.73511: Sent initial data (156 bytes) 37031 1727204385.74480: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.74489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.74499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.74512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.74554: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.74563: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.74574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.74588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.74595: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.74602: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.74609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.74618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.74629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.74636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.74643: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.74652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.74730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.74744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.74754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.74822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.76525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204385.76561: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204385.76600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpta554rwg /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/AnsiballZ_command.py <<< 37031 1727204385.76637: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204385.77734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.77939: stderr chunk (state=3): >>><<< 37031 1727204385.77942: stdout chunk (state=3): >>><<< 37031 1727204385.77969: done transferring module to remote 37031 1727204385.77980: _low_level_execute_command(): starting 37031 1727204385.77985: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/ /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/AnsiballZ_command.py && sleep 0' 37031 1727204385.79156: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.79173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.79188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.79208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.79252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 37031 1727204385.79266: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204385.79281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.79300: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.79312: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.79326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.79339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.79354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.79374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.79387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.79399: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.79416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.79500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.79519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.79537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.79614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.81316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204385.81392: stderr chunk (state=3): >>><<< 37031 1727204385.81394: stdout chunk (state=3): >>><<< 37031 1727204385.81497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204385.81500: _low_level_execute_command(): starting 37031 1727204385.81503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/AnsiballZ_command.py && sleep 0' 37031 1727204385.82131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204385.82152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.82174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.82194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.82238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.82251: stderr chunk (state=3): >>>debug2: match not found <<< 37031 
1727204385.82275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.82294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204385.82307: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204385.82319: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204385.82331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204385.82345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204385.82362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204385.82381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204385.82394: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204385.82408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204385.82489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204385.82513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204385.82530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204385.82613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204385.98356: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:59:45.960908", "end": "2024-09-24 14:59:45.982706", "delta": "0:00:00.021798", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, 
"argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204385.99589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204385.99646: stderr chunk (state=3): >>><<< 37031 1727204385.99650: stdout chunk (state=3): >>><<< 37031 1727204385.99669: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-24 14:59:45.960908", "end": "2024-09-24 14:59:45.982706", "delta": "0:00:00.021798", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204385.99706: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204385.99713: _low_level_execute_command(): starting 37031 1727204385.99718: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204385.7001626-37757-227433345490401/ > /dev/null 2>&1 && sleep 0' 37031 1727204386.00187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.00191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.00227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.00233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 37031 1727204386.00245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.00251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204386.00259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.00317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.00320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.00332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.00389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.02176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.02233: stderr chunk (state=3): >>><<< 37031 1727204386.02239: stdout chunk (state=3): >>><<< 37031 1727204386.02254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.02263: handler run complete 37031 1727204386.02284: Evaluated conditional (False): False 37031 1727204386.02292: attempt loop complete, returning result 37031 1727204386.02295: _execute() done 37031 1727204386.02297: dumping result to json 37031 1727204386.02303: done dumping result, returning 37031 1727204386.02314: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-b754-dfb8-00000000015b] 37031 1727204386.02317: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015b 37031 1727204386.02412: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015b 37031 1727204386.02415: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.021798", "end": "2024-09-24 14:59:45.982706", "rc": 0, "start": "2024-09-24 14:59:45.960908" } 37031 1727204386.02478: no more pending results, returning what we have 37031 1727204386.02481: results queue empty 37031 1727204386.02482: checking for any_errors_fatal 37031 1727204386.02489: done checking for any_errors_fatal 37031 1727204386.02490: checking for max_fail_percentage 37031 1727204386.02492: done checking for max_fail_percentage 37031 1727204386.02493: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.02494: done checking to see if all hosts have failed 37031 1727204386.02494: getting the remaining hosts for this loop 37031 1727204386.02496: done getting the remaining hosts for this loop 37031 1727204386.02500: getting the next task for host managed-node2 37031 1727204386.02506: done getting next 
task for host managed-node2 37031 1727204386.02508: ^ task is: TASK: Delete veth interface {{ interface }} 37031 1727204386.02511: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204386.02516: getting variables 37031 1727204386.02518: in VariableManager get_vars() 37031 1727204386.02561: Calling all_inventory to load vars for managed-node2 37031 1727204386.02566: Calling groups_inventory to load vars for managed-node2 37031 1727204386.02568: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.02578: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.02580: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.02583: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.02706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.02823: done with get_vars() 37031 1727204386.02832: done getting variables 37031 1727204386.02882: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204386.02972: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* 
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.382) 0:00:08.574 ***** 37031 1727204386.02999: entering _queue_task() for managed-node2/command 37031 1727204386.03194: worker is 1 (out of 1 available) 37031 1727204386.03207: exiting _queue_task() for managed-node2/command 37031 1727204386.03221: done queuing things up, now waiting for results queue to drain 37031 1727204386.03222: waiting for pending results... 37031 1727204386.03380: running TaskExecutor() for managed-node2/TASK: Delete veth interface veth0 37031 1727204386.03449: in run() - task 0affcd87-79f5-b754-dfb8-00000000015c 37031 1727204386.03463: variable 'ansible_search_path' from source: unknown 37031 1727204386.03469: variable 'ansible_search_path' from source: unknown 37031 1727204386.03500: calling self._execute() 37031 1727204386.03560: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.03567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.03575: variable 'omit' from source: magic vars 37031 1727204386.03829: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.03839: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.04022: variable 'type' from source: play vars 37031 1727204386.04025: variable 'state' from source: include params 37031 1727204386.04028: variable 'interface' from source: play vars 37031 1727204386.04034: variable 'current_interfaces' from source: set_fact 37031 1727204386.04043: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 37031 1727204386.04046: when evaluation is False, skipping this task 37031 1727204386.04049: _execute() done 37031 1727204386.04051: dumping result to json 37031 1727204386.04053: done dumping result, returning 37031 
1727204386.04061: done running TaskExecutor() for managed-node2/TASK: Delete veth interface veth0 [0affcd87-79f5-b754-dfb8-00000000015c] 37031 1727204386.04066: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015c 37031 1727204386.04149: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015c 37031 1727204386.04151: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 37031 1727204386.04199: no more pending results, returning what we have 37031 1727204386.04202: results queue empty 37031 1727204386.04203: checking for any_errors_fatal 37031 1727204386.04212: done checking for any_errors_fatal 37031 1727204386.04212: checking for max_fail_percentage 37031 1727204386.04214: done checking for max_fail_percentage 37031 1727204386.04215: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.04216: done checking to see if all hosts have failed 37031 1727204386.04216: getting the remaining hosts for this loop 37031 1727204386.04218: done getting the remaining hosts for this loop 37031 1727204386.04222: getting the next task for host managed-node2 37031 1727204386.04227: done getting next task for host managed-node2 37031 1727204386.04229: ^ task is: TASK: Create dummy interface {{ interface }} 37031 1727204386.04232: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 37031 1727204386.04235: getting variables 37031 1727204386.04236: in VariableManager get_vars() 37031 1727204386.04279: Calling all_inventory to load vars for managed-node2 37031 1727204386.04282: Calling groups_inventory to load vars for managed-node2 37031 1727204386.04284: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.04293: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.04294: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.04296: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.04436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.04553: done with get_vars() 37031 1727204386.04561: done getting variables 37031 1727204386.04605: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204386.04684: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.017) 0:00:08.591 ***** 37031 1727204386.04710: entering _queue_task() for managed-node2/command 37031 1727204386.04895: worker is 1 (out of 1 available) 37031 1727204386.04910: exiting _queue_task() for managed-node2/command 37031 1727204386.04922: done queuing things up, now waiting for results queue to drain 37031 1727204386.04923: waiting for pending results... 
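
The ok result recorded earlier for "Set up veth as managed by NetworkManager" shows the exact argv (`nmcli d set veth0 managed true`) and rc=0. The task file itself is not reproduced in this log, but a task of roughly this shape in manage_test_interface.yml would produce that record (a hedged sketch from the result JSON, not the verbatim source):

```yaml
# Sketch inferred from the result JSON above; the real task in
# manage_test_interface.yml may differ in wording and options.
- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true
```
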
37031 1727204386.05077: running TaskExecutor() for managed-node2/TASK: Create dummy interface veth0 37031 1727204386.05145: in run() - task 0affcd87-79f5-b754-dfb8-00000000015d 37031 1727204386.05153: variable 'ansible_search_path' from source: unknown 37031 1727204386.05159: variable 'ansible_search_path' from source: unknown 37031 1727204386.05190: calling self._execute() 37031 1727204386.05247: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.05251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.05261: variable 'omit' from source: magic vars 37031 1727204386.05510: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.05519: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.05650: variable 'type' from source: play vars 37031 1727204386.05653: variable 'state' from source: include params 37031 1727204386.05659: variable 'interface' from source: play vars 37031 1727204386.05662: variable 'current_interfaces' from source: set_fact 37031 1727204386.05671: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 37031 1727204386.05674: when evaluation is False, skipping this task 37031 1727204386.05677: _execute() done 37031 1727204386.05680: dumping result to json 37031 1727204386.05683: done dumping result, returning 37031 1727204386.05692: done running TaskExecutor() for managed-node2/TASK: Create dummy interface veth0 [0affcd87-79f5-b754-dfb8-00000000015d] 37031 1727204386.05695: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015d 37031 1727204386.05777: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015d 37031 1727204386.05780: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result 
was False" } 37031 1727204386.05846: no more pending results, returning what we have 37031 1727204386.05849: results queue empty 37031 1727204386.05850: checking for any_errors_fatal 37031 1727204386.05854: done checking for any_errors_fatal 37031 1727204386.05855: checking for max_fail_percentage 37031 1727204386.05856: done checking for max_fail_percentage 37031 1727204386.05857: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.05858: done checking to see if all hosts have failed 37031 1727204386.05859: getting the remaining hosts for this loop 37031 1727204386.05860: done getting the remaining hosts for this loop 37031 1727204386.05866: getting the next task for host managed-node2 37031 1727204386.05871: done getting next task for host managed-node2 37031 1727204386.05874: ^ task is: TASK: Delete dummy interface {{ interface }} 37031 1727204386.05876: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.05880: getting variables 37031 1727204386.05881: in VariableManager get_vars() 37031 1727204386.05915: Calling all_inventory to load vars for managed-node2 37031 1727204386.05917: Calling groups_inventory to load vars for managed-node2 37031 1727204386.05918: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.05925: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.05926: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.05928: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.06037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.06150: done with get_vars() 37031 1727204386.06158: done getting variables 37031 1727204386.06199: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204386.06283: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.015) 0:00:08.607 ***** 37031 1727204386.06303: entering _queue_task() for managed-node2/command 37031 1727204386.06484: worker is 1 (out of 1 available) 37031 1727204386.06496: exiting _queue_task() for managed-node2/command 37031 1727204386.06510: done queuing things up, now waiting for results queue to drain 37031 1727204386.06511: waiting for pending results... 
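
Each of the skipped tasks above follows the same guarded-command pattern: a templated name such as "Delete dummy interface {{ interface }}", a `when:` expression that the log echoes back verbatim as `false_condition`, and a command body that never executes. For the dummy-delete case the shape would be roughly as follows; the `ip` invocation is a hypothetical stand-in, since a skipped task's argv is never logged:

```yaml
# Hedged sketch; only the task name and the when: expression are taken
# from the log. The ip command body is a hypothetical stand-in.
- name: Delete dummy interface {{ interface }}
  command: ip link delete {{ interface }} type dummy  # hypothetical body
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces
```
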
37031 1727204386.06660: running TaskExecutor() for managed-node2/TASK: Delete dummy interface veth0 37031 1727204386.06729: in run() - task 0affcd87-79f5-b754-dfb8-00000000015e 37031 1727204386.06739: variable 'ansible_search_path' from source: unknown 37031 1727204386.06742: variable 'ansible_search_path' from source: unknown 37031 1727204386.06775: calling self._execute() 37031 1727204386.06832: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.06836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.06844: variable 'omit' from source: magic vars 37031 1727204386.07100: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.07110: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.07243: variable 'type' from source: play vars 37031 1727204386.07247: variable 'state' from source: include params 37031 1727204386.07250: variable 'interface' from source: play vars 37031 1727204386.07252: variable 'current_interfaces' from source: set_fact 37031 1727204386.07262: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 37031 1727204386.07267: when evaluation is False, skipping this task 37031 1727204386.07270: _execute() done 37031 1727204386.07272: dumping result to json 37031 1727204386.07276: done dumping result, returning 37031 1727204386.07280: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface veth0 [0affcd87-79f5-b754-dfb8-00000000015e] 37031 1727204386.07287: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015e 37031 1727204386.07365: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015e 37031 1727204386.07368: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 37031 1727204386.07425: no more pending results, returning what we have 37031 1727204386.07428: results queue empty 37031 1727204386.07429: checking for any_errors_fatal 37031 1727204386.07433: done checking for any_errors_fatal 37031 1727204386.07434: checking for max_fail_percentage 37031 1727204386.07435: done checking for max_fail_percentage 37031 1727204386.07436: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.07437: done checking to see if all hosts have failed 37031 1727204386.07438: getting the remaining hosts for this loop 37031 1727204386.07439: done getting the remaining hosts for this loop 37031 1727204386.07443: getting the next task for host managed-node2 37031 1727204386.07448: done getting next task for host managed-node2 37031 1727204386.07450: ^ task is: TASK: Create tap interface {{ interface }} 37031 1727204386.07455: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.07458: getting variables 37031 1727204386.07459: in VariableManager get_vars() 37031 1727204386.07491: Calling all_inventory to load vars for managed-node2 37031 1727204386.07493: Calling groups_inventory to load vars for managed-node2 37031 1727204386.07494: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.07501: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.07503: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.07505: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.07642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.07758: done with get_vars() 37031 1727204386.07767: done getting variables 37031 1727204386.07806: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204386.07883: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.015) 0:00:08.623 ***** 37031 1727204386.07902: entering _queue_task() for managed-node2/command 37031 1727204386.08074: worker is 1 (out of 1 available) 37031 1727204386.08088: exiting _queue_task() for managed-node2/command 37031 1727204386.08099: done queuing things up, now waiting for results queue to drain 37031 1727204386.08101: waiting for pending results... 
37031 1727204386.08242: running TaskExecutor() for managed-node2/TASK: Create tap interface veth0 37031 1727204386.08304: in run() - task 0affcd87-79f5-b754-dfb8-00000000015f 37031 1727204386.08318: variable 'ansible_search_path' from source: unknown 37031 1727204386.08323: variable 'ansible_search_path' from source: unknown 37031 1727204386.08348: calling self._execute() 37031 1727204386.08404: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.08408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.08419: variable 'omit' from source: magic vars 37031 1727204386.08675: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.08685: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.08816: variable 'type' from source: play vars 37031 1727204386.08820: variable 'state' from source: include params 37031 1727204386.08823: variable 'interface' from source: play vars 37031 1727204386.08828: variable 'current_interfaces' from source: set_fact 37031 1727204386.08835: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 37031 1727204386.08838: when evaluation is False, skipping this task 37031 1727204386.08840: _execute() done 37031 1727204386.08843: dumping result to json 37031 1727204386.08847: done dumping result, returning 37031 1727204386.08851: done running TaskExecutor() for managed-node2/TASK: Create tap interface veth0 [0affcd87-79f5-b754-dfb8-00000000015f] 37031 1727204386.08861: sending task result for task 0affcd87-79f5-b754-dfb8-00000000015f 37031 1727204386.08939: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000015f 37031 1727204386.08941: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 37031 1727204386.09013: no more pending results, returning what we have 37031 1727204386.09016: results queue empty 37031 1727204386.09017: checking for any_errors_fatal 37031 1727204386.09021: done checking for any_errors_fatal 37031 1727204386.09022: checking for max_fail_percentage 37031 1727204386.09023: done checking for max_fail_percentage 37031 1727204386.09024: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.09025: done checking to see if all hosts have failed 37031 1727204386.09025: getting the remaining hosts for this loop 37031 1727204386.09026: done getting the remaining hosts for this loop 37031 1727204386.09030: getting the next task for host managed-node2 37031 1727204386.09034: done getting next task for host managed-node2 37031 1727204386.09037: ^ task is: TASK: Delete tap interface {{ interface }} 37031 1727204386.09039: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.09042: getting variables 37031 1727204386.09043: in VariableManager get_vars() 37031 1727204386.09073: Calling all_inventory to load vars for managed-node2 37031 1727204386.09075: Calling groups_inventory to load vars for managed-node2 37031 1727204386.09076: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.09083: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.09085: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.09086: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.09192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.09310: done with get_vars() 37031 1727204386.09316: done getting variables 37031 1727204386.09357: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204386.09434: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.015) 0:00:08.639 ***** 37031 1727204386.09457: entering _queue_task() for managed-node2/command 37031 1727204386.09630: worker is 1 (out of 1 available) 37031 1727204386.09645: exiting _queue_task() for managed-node2/command 37031 1727204386.09660: done queuing things up, now waiting for results queue to drain 37031 1727204386.09662: waiting for pending results... 
37031 1727204386.09808: running TaskExecutor() for managed-node2/TASK: Delete tap interface veth0 37031 1727204386.09878: in run() - task 0affcd87-79f5-b754-dfb8-000000000160 37031 1727204386.09889: variable 'ansible_search_path' from source: unknown 37031 1727204386.09893: variable 'ansible_search_path' from source: unknown 37031 1727204386.09921: calling self._execute() 37031 1727204386.09982: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.09986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.09995: variable 'omit' from source: magic vars 37031 1727204386.10238: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.10248: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.10383: variable 'type' from source: play vars 37031 1727204386.10386: variable 'state' from source: include params 37031 1727204386.10389: variable 'interface' from source: play vars 37031 1727204386.10392: variable 'current_interfaces' from source: set_fact 37031 1727204386.10396: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 37031 1727204386.10399: when evaluation is False, skipping this task 37031 1727204386.10401: _execute() done 37031 1727204386.10405: dumping result to json 37031 1727204386.10407: done dumping result, returning 37031 1727204386.10414: done running TaskExecutor() for managed-node2/TASK: Delete tap interface veth0 [0affcd87-79f5-b754-dfb8-000000000160] 37031 1727204386.10418: sending task result for task 0affcd87-79f5-b754-dfb8-000000000160 37031 1727204386.10503: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000160 37031 1727204386.10505: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
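
Taken together, the skip records are informative: the log never prints the play vars directly, but the pattern of results (the veth nmcli task ran; the veth-delete and every dummy/tap branch evaluated False) is consistent with `type == 'veth'`, `state == 'present'`, and `veth0` already present in `current_interfaces`. A minimal sketch under those inferred values:

```python
# Inferred variable values: the log records only the evaluated
# conditionals, but this combination reproduces every skip above.
type_ = "veth"                   # play var `type` (renamed; `type` shadows a Python builtin)
state = "present"                # include param `state`
interface = "veth0"              # play var `interface`
current_interfaces = ["veth0"]   # set_fact result from an earlier task

# The false_condition strings from the skip records, evaluated as Python:
conditions = {
    "Delete veth":  type_ == "veth" and state == "absent" and interface in current_interfaces,
    "Create dummy": type_ == "dummy" and state == "present" and interface not in current_interfaces,
    "Delete dummy": type_ == "dummy" and state == "absent" and interface in current_interfaces,
    "Create tap":   type_ == "tap" and state == "present" and interface not in current_interfaces,
    "Delete tap":   type_ == "tap" and state == "absent" and interface in current_interfaces,
}

assert not any(conditions.values())  # matches the five "skipping" results in the log
```

This also explains why only the nmcli task produced a changed/ok result: with a veth already created and present, the manage_test_interface.yml include has no interface to create or delete, so every remaining guarded task short-circuits.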
37031 1727204386.10582: no more pending results, returning what we have 37031 1727204386.10585: results queue empty 37031 1727204386.10586: checking for any_errors_fatal 37031 1727204386.10591: done checking for any_errors_fatal 37031 1727204386.10592: checking for max_fail_percentage 37031 1727204386.10593: done checking for max_fail_percentage 37031 1727204386.10594: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.10595: done checking to see if all hosts have failed 37031 1727204386.10595: getting the remaining hosts for this loop 37031 1727204386.10596: done getting the remaining hosts for this loop 37031 1727204386.10600: getting the next task for host managed-node2 37031 1727204386.10606: done getting next task for host managed-node2 37031 1727204386.10608: ^ task is: TASK: Set up gateway ip on veth peer 37031 1727204386.10610: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.10612: getting variables 37031 1727204386.10613: in VariableManager get_vars() 37031 1727204386.10644: Calling all_inventory to load vars for managed-node2 37031 1727204386.10646: Calling groups_inventory to load vars for managed-node2 37031 1727204386.10647: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.10656: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.10658: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.10660: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.10806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.10921: done with get_vars() 37031 1727204386.10928: done getting variables 37031 1727204386.10998: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.015) 0:00:08.655 ***** 37031 1727204386.11017: entering _queue_task() for managed-node2/shell 37031 1727204386.11018: Creating lock for shell 37031 1727204386.11205: worker is 1 (out of 1 available) 37031 1727204386.11219: exiting _queue_task() for managed-node2/shell 37031 1727204386.11230: done queuing things up, now waiting for results queue to drain 37031 1727204386.11233: waiting for pending results... 
37031 1727204386.11381: running TaskExecutor() for managed-node2/TASK: Set up gateway ip on veth peer 37031 1727204386.11437: in run() - task 0affcd87-79f5-b754-dfb8-00000000000d 37031 1727204386.11446: variable 'ansible_search_path' from source: unknown 37031 1727204386.11478: calling self._execute() 37031 1727204386.11538: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.11542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.11549: variable 'omit' from source: magic vars 37031 1727204386.11807: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.11817: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.11822: variable 'omit' from source: magic vars 37031 1727204386.11847: variable 'omit' from source: magic vars 37031 1727204386.11939: variable 'interface' from source: play vars 37031 1727204386.11992: variable 'omit' from source: magic vars 37031 1727204386.12006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204386.12034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204386.12055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204386.12069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.12078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.12101: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204386.12103: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.12106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.12177: 
Set connection var ansible_connection to ssh 37031 1727204386.12180: Set connection var ansible_shell_type to sh 37031 1727204386.12186: Set connection var ansible_pipelining to False 37031 1727204386.12193: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204386.12199: Set connection var ansible_timeout to 10 37031 1727204386.12204: Set connection var ansible_shell_executable to /bin/sh 37031 1727204386.12224: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.12227: variable 'ansible_connection' from source: unknown 37031 1727204386.12230: variable 'ansible_module_compression' from source: unknown 37031 1727204386.12232: variable 'ansible_shell_type' from source: unknown 37031 1727204386.12234: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.12237: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.12239: variable 'ansible_pipelining' from source: unknown 37031 1727204386.12243: variable 'ansible_timeout' from source: unknown 37031 1727204386.12247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.12348: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204386.12359: variable 'omit' from source: magic vars 37031 1727204386.12363: starting attempt loop 37031 1727204386.12368: running the handler 37031 1727204386.12378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204386.12392: 
_low_level_execute_command(): starting 37031 1727204386.12400: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204386.12934: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.12955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.13036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.13107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.13124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.13148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.13227: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.14772: stdout chunk (state=3): >>>/root <<< 37031 1727204386.14872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.14923: stderr chunk (state=3): >>><<< 37031 1727204386.14926: stdout chunk (state=3): >>><<< 37031 1727204386.14945: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.14959: _low_level_execute_command(): starting 37031 1727204386.14962: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253 `" && echo ansible-tmp-1727204386.1494646-37788-80381866232253="` echo /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253 `" ) && sleep 0' 37031 1727204386.15425: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.15428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.15462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.15469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.15521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.15524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.15571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.17413: stdout chunk (state=3): >>>ansible-tmp-1727204386.1494646-37788-80381866232253=/root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253 <<< 37031 1727204386.17519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.17608: stderr chunk (state=3): >>><<< 37031 1727204386.17619: stdout chunk (state=3): >>><<< 37031 1727204386.17949: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204386.1494646-37788-80381866232253=/root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.17953: variable 'ansible_module_compression' from source: unknown 37031 1727204386.17955: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204386.17957: variable 'ansible_facts' from source: unknown 37031 1727204386.17959: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/AnsiballZ_command.py 37031 1727204386.18024: Sending initial data 37031 1727204386.18027: Sent initial data (155 bytes) 37031 1727204386.19026: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204386.19042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.19056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.19077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.19120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.19134: stderr chunk (state=3): >>>debug2: match not found <<< 37031 
1727204386.19151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.19170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204386.19181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204386.19191: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204386.19201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.19213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.19226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.19238: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.19252: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204386.19268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.19341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.19372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.19392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.19467: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.21178: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204386.21212: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204386.21247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp16opws0h /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/AnsiballZ_command.py <<< 37031 1727204386.21298: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204386.22467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.22552: stderr chunk (state=3): >>><<< 37031 1727204386.22556: stdout chunk (state=3): >>><<< 37031 1727204386.22661: done transferring module to remote 37031 1727204386.22666: _low_level_execute_command(): starting 37031 1727204386.22669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/ /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/AnsiballZ_command.py && sleep 0' 37031 1727204386.23267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204386.23283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.23298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.23318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.23367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.23381: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204386.23395: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.23417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204386.23432: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204386.23443: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204386.23458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.23476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.23492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.23504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.23516: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204386.23533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.23614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.23649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.23674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.23739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.25435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.25519: stderr chunk (state=3): >>><<< 37031 1727204386.25530: stdout chunk (state=3): >>><<< 37031 1727204386.25571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.25574: _low_level_execute_command(): starting 37031 1727204386.25650: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/AnsiballZ_command.py && sleep 0' 37031 1727204386.26252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204386.26273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.26288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.26310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.26352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.26369: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204386.26382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.26400: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 37031 1727204386.26414: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204386.26428: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204386.26443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.26459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.26478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.26490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.26501: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204386.26513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.26594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.26615: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.26633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.26717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.42370: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 14:59:46.397273", "end": "2024-09-24 14:59:46.422889", "delta": "0:00:00.025616", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": 
true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204386.43517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204386.43579: stderr chunk (state=3): >>><<< 37031 1727204386.43582: stdout chunk (state=3): >>><<< 37031 1727204386.43600: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-24 14:59:46.397273", "end": "2024-09-24 14:59:46.422889", "delta": "0:00:00.025616", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204386.43633: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204386.43641: _low_level_execute_command(): starting 37031 1727204386.43645: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204386.1494646-37788-80381866232253/ > /dev/null 2>&1 && sleep 0' 37031 1727204386.44116: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.44120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.44158: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.44161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.44165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.44219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.44222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.44228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.44271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.46045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.46104: stderr chunk (state=3): >>><<< 37031 1727204386.46108: stdout chunk (state=3): >>><<< 37031 1727204386.46124: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.46131: handler run complete 37031 1727204386.46148: Evaluated conditional (False): False 37031 1727204386.46159: attempt loop complete, returning result 37031 1727204386.46162: _execute() done 37031 1727204386.46166: dumping result to json 37031 1727204386.46171: done dumping result, returning 37031 1727204386.46178: done running TaskExecutor() for managed-node2/TASK: Set up gateway ip on veth peer [0affcd87-79f5-b754-dfb8-00000000000d] 37031 1727204386.46183: sending task result for task 0affcd87-79f5-b754-dfb8-00000000000d 37031 1727204386.46283: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000000d 37031 1727204386.46286: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.025616", "end": "2024-09-24 14:59:46.422889", "rc": 0, "start": "2024-09-24 14:59:46.397273" } 37031 1727204386.46371: no more pending results, returning what we have 37031 1727204386.46375: results queue empty 37031 1727204386.46375: checking for any_errors_fatal 37031 1727204386.46380: done checking for any_errors_fatal 37031 1727204386.46381: checking for max_fail_percentage 37031 1727204386.46383: done checking for 
max_fail_percentage 37031 1727204386.46384: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.46384: done checking to see if all hosts have failed 37031 1727204386.46385: getting the remaining hosts for this loop 37031 1727204386.46387: done getting the remaining hosts for this loop 37031 1727204386.46391: getting the next task for host managed-node2 37031 1727204386.46398: done getting next task for host managed-node2 37031 1727204386.46401: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 37031 1727204386.46403: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204386.46406: getting variables 37031 1727204386.46407: in VariableManager get_vars() 37031 1727204386.46446: Calling all_inventory to load vars for managed-node2 37031 1727204386.46449: Calling groups_inventory to load vars for managed-node2 37031 1727204386.46451: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.46460: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.46462: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.46473: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.46592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.46711: done with get_vars() 37031 1727204386.46719: done getting variables 37031 1727204386.46762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.357) 0:00:09.012 ***** 37031 1727204386.46783: entering _queue_task() for managed-node2/debug 37031 1727204386.46975: worker is 1 (out of 1 available) 37031 1727204386.46990: exiting _queue_task() for managed-node2/debug 37031 1727204386.47002: done queuing things up, now waiting for results queue to drain 37031 1727204386.47003: waiting for pending results... 37031 1727204386.47169: running TaskExecutor() for managed-node2/TASK: TEST: I can configure an interface with static ipv6 config 37031 1727204386.47220: in run() - task 0affcd87-79f5-b754-dfb8-00000000000f 37031 1727204386.47235: variable 'ansible_search_path' from source: unknown 37031 1727204386.47269: calling self._execute() 37031 1727204386.47328: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.47332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.47346: variable 'omit' from source: magic vars 37031 1727204386.47602: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.47612: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.47617: variable 'omit' from source: magic vars 37031 1727204386.47633: variable 'omit' from source: magic vars 37031 1727204386.47657: variable 'omit' from source: magic vars 37031 1727204386.47697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204386.47722: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204386.47738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204386.47750: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.47761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.47788: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204386.47791: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.47794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.47860: Set connection var ansible_connection to ssh 37031 1727204386.47863: Set connection var ansible_shell_type to sh 37031 1727204386.47867: Set connection var ansible_pipelining to False 37031 1727204386.47875: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204386.47880: Set connection var ansible_timeout to 10 37031 1727204386.47888: Set connection var ansible_shell_executable to /bin/sh 37031 1727204386.47908: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.47911: variable 'ansible_connection' from source: unknown 37031 1727204386.47914: variable 'ansible_module_compression' from source: unknown 37031 1727204386.47916: variable 'ansible_shell_type' from source: unknown 37031 1727204386.47918: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.47920: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.47923: variable 'ansible_pipelining' from source: unknown 37031 1727204386.47925: variable 'ansible_timeout' from source: unknown 37031 1727204386.47927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.48027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204386.48037: variable 'omit' from source: magic vars 37031 1727204386.48040: starting attempt loop 37031 1727204386.48043: running the handler 37031 1727204386.48081: handler run complete 37031 1727204386.48319: attempt loop complete, returning result 37031 1727204386.48323: _execute() done 37031 1727204386.48327: dumping result to json 37031 1727204386.48334: done dumping result, returning 37031 1727204386.48341: done running TaskExecutor() for managed-node2/TASK: TEST: I can configure an interface with static ipv6 config [0affcd87-79f5-b754-dfb8-00000000000f] 37031 1727204386.48347: sending task result for task 0affcd87-79f5-b754-dfb8-00000000000f 37031 1727204386.48425: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000000f 37031 1727204386.48427: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: ################################################## 37031 1727204386.48476: no more pending results, returning what we have 37031 1727204386.48479: results queue empty 37031 1727204386.48480: checking for any_errors_fatal 37031 1727204386.48487: done checking for any_errors_fatal 37031 1727204386.48488: checking for max_fail_percentage 37031 1727204386.48490: done checking for max_fail_percentage 37031 1727204386.48491: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.48491: done checking to see if all hosts have failed 37031 1727204386.48492: getting the remaining hosts for this loop 37031 1727204386.48494: done getting the remaining hosts for this loop 37031 1727204386.48498: getting the next task for host managed-node2 37031 1727204386.48504: done getting next task for host managed-node2 37031 1727204386.48509: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 37031 1727204386.48512: 
^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204386.48526: getting variables 37031 1727204386.48527: in VariableManager get_vars() 37031 1727204386.48571: Calling all_inventory to load vars for managed-node2 37031 1727204386.48574: Calling groups_inventory to load vars for managed-node2 37031 1727204386.48576: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.48583: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.48585: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.48587: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.48854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.48972: done with get_vars() 37031 1727204386.48980: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.022) 0:00:09.035 ***** 37031 1727204386.49043: entering _queue_task() for managed-node2/include_tasks 37031 1727204386.49226: worker is 1 (out of 1 available) 37031 1727204386.49241: exiting _queue_task() for managed-node2/include_tasks 37031 1727204386.49257: done queuing things up, now waiting for results queue to drain 37031 1727204386.49258: waiting for pending 
results... 37031 1727204386.49416: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 37031 1727204386.49500: in run() - task 0affcd87-79f5-b754-dfb8-000000000017 37031 1727204386.49510: variable 'ansible_search_path' from source: unknown 37031 1727204386.49513: variable 'ansible_search_path' from source: unknown 37031 1727204386.49544: calling self._execute() 37031 1727204386.49604: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.49608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.49615: variable 'omit' from source: magic vars 37031 1727204386.49873: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.49882: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.49887: _execute() done 37031 1727204386.49891: dumping result to json 37031 1727204386.49894: done dumping result, returning 37031 1727204386.49900: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-b754-dfb8-000000000017] 37031 1727204386.49905: sending task result for task 0affcd87-79f5-b754-dfb8-000000000017 37031 1727204386.49990: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000017 37031 1727204386.49993: WORKER PROCESS EXITING 37031 1727204386.50056: no more pending results, returning what we have 37031 1727204386.50063: in VariableManager get_vars() 37031 1727204386.50108: Calling all_inventory to load vars for managed-node2 37031 1727204386.50111: Calling groups_inventory to load vars for managed-node2 37031 1727204386.50112: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.50119: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.50120: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.50122: Calling 
groups_plugins_play to load vars for managed-node2 37031 1727204386.50231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.50355: done with get_vars() 37031 1727204386.50361: variable 'ansible_search_path' from source: unknown 37031 1727204386.50362: variable 'ansible_search_path' from source: unknown 37031 1727204386.50388: we have included files to process 37031 1727204386.50389: generating all_blocks data 37031 1727204386.50390: done generating all_blocks data 37031 1727204386.50393: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204386.50394: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204386.50395: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204386.50893: done processing included file 37031 1727204386.50895: iterating over new_blocks loaded from include file 37031 1727204386.50896: in VariableManager get_vars() 37031 1727204386.50929: done with get_vars() 37031 1727204386.50930: filtering new block on tags 37031 1727204386.50943: done filtering new block on tags 37031 1727204386.50944: in VariableManager get_vars() 37031 1727204386.50959: done with get_vars() 37031 1727204386.50960: filtering new block on tags 37031 1727204386.50978: done filtering new block on tags 37031 1727204386.50980: in VariableManager get_vars() 37031 1727204386.50994: done with get_vars() 37031 1727204386.50995: filtering new block on tags 37031 1727204386.51006: done filtering new block on tags 37031 1727204386.51007: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 37031 1727204386.51011: extending task lists for 
all hosts with included blocks 37031 1727204386.51499: done extending task lists 37031 1727204386.51501: done processing included files 37031 1727204386.51501: results queue empty 37031 1727204386.51501: checking for any_errors_fatal 37031 1727204386.51504: done checking for any_errors_fatal 37031 1727204386.51504: checking for max_fail_percentage 37031 1727204386.51505: done checking for max_fail_percentage 37031 1727204386.51506: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.51506: done checking to see if all hosts have failed 37031 1727204386.51507: getting the remaining hosts for this loop 37031 1727204386.51508: done getting the remaining hosts for this loop 37031 1727204386.51509: getting the next task for host managed-node2 37031 1727204386.51512: done getting next task for host managed-node2 37031 1727204386.51514: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 37031 1727204386.51516: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.51523: getting variables 37031 1727204386.51524: in VariableManager get_vars() 37031 1727204386.51534: Calling all_inventory to load vars for managed-node2 37031 1727204386.51535: Calling groups_inventory to load vars for managed-node2 37031 1727204386.51537: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.51540: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.51541: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.51543: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.51623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.51757: done with get_vars() 37031 1727204386.51763: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.027) 0:00:09.062 ***** 37031 1727204386.51811: entering _queue_task() for managed-node2/setup 37031 1727204386.52285: worker is 1 (out of 1 available) 37031 1727204386.52293: exiting _queue_task() for managed-node2/setup 37031 1727204386.52304: done queuing things up, now waiting for results queue to drain 37031 1727204386.52305: waiting for pending results... 
37031 1727204386.52334: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 37031 1727204386.52489: in run() - task 0affcd87-79f5-b754-dfb8-0000000001fc 37031 1727204386.52518: variable 'ansible_search_path' from source: unknown 37031 1727204386.52526: variable 'ansible_search_path' from source: unknown 37031 1727204386.52572: calling self._execute() 37031 1727204386.52662: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.52676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.52688: variable 'omit' from source: magic vars 37031 1727204386.53174: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.53220: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.53396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204386.54961: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204386.55015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204386.55044: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204386.55095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204386.55126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204386.55217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204386.55268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204386.55299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204386.55344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204386.55376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204386.55432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204386.55464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204386.55768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204386.55803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204386.55817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204386.55965: variable '__network_required_facts' from source: role 
'' defaults 37031 1727204386.55974: variable 'ansible_facts' from source: unknown 37031 1727204386.56061: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 37031 1727204386.56067: when evaluation is False, skipping this task 37031 1727204386.56069: _execute() done 37031 1727204386.56072: dumping result to json 37031 1727204386.56074: done dumping result, returning 37031 1727204386.56082: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-b754-dfb8-0000000001fc] 37031 1727204386.56085: sending task result for task 0affcd87-79f5-b754-dfb8-0000000001fc 37031 1727204386.56177: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000001fc 37031 1727204386.56179: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204386.56241: no more pending results, returning what we have 37031 1727204386.56245: results queue empty 37031 1727204386.56246: checking for any_errors_fatal 37031 1727204386.56247: done checking for any_errors_fatal 37031 1727204386.56248: checking for max_fail_percentage 37031 1727204386.56249: done checking for max_fail_percentage 37031 1727204386.56250: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.56251: done checking to see if all hosts have failed 37031 1727204386.56252: getting the remaining hosts for this loop 37031 1727204386.56256: done getting the remaining hosts for this loop 37031 1727204386.56260: getting the next task for host managed-node2 37031 1727204386.56269: done getting next task for host managed-node2 37031 1727204386.56274: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 37031 1727204386.56277: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204386.56291: getting variables 37031 1727204386.56292: in VariableManager get_vars() 37031 1727204386.56333: Calling all_inventory to load vars for managed-node2 37031 1727204386.56336: Calling groups_inventory to load vars for managed-node2 37031 1727204386.56337: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.56346: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.56348: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.56350: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.56537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.56768: done with get_vars() 37031 1727204386.56780: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.050) 0:00:09.113 ***** 37031 1727204386.56889: entering _queue_task() for managed-node2/stat 37031 1727204386.57157: worker is 1 (out of 1 
available) 37031 1727204386.57171: exiting _queue_task() for managed-node2/stat 37031 1727204386.57185: done queuing things up, now waiting for results queue to drain 37031 1727204386.57186: waiting for pending results... 37031 1727204386.57468: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 37031 1727204386.57618: in run() - task 0affcd87-79f5-b754-dfb8-0000000001fe 37031 1727204386.57640: variable 'ansible_search_path' from source: unknown 37031 1727204386.57647: variable 'ansible_search_path' from source: unknown 37031 1727204386.57689: calling self._execute() 37031 1727204386.57777: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.57787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.57805: variable 'omit' from source: magic vars 37031 1727204386.58184: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.58201: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.58376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204386.58738: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204386.58794: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204386.58834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204386.58875: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204386.58971: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204386.59007: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204386.59043: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204386.59080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204386.59184: variable '__network_is_ostree' from source: set_fact 37031 1727204386.59197: Evaluated conditional (not __network_is_ostree is defined): False 37031 1727204386.59204: when evaluation is False, skipping this task 37031 1727204386.59216: _execute() done 37031 1727204386.59224: dumping result to json 37031 1727204386.59230: done dumping result, returning 37031 1727204386.59239: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-b754-dfb8-0000000001fe] 37031 1727204386.59246: sending task result for task 0affcd87-79f5-b754-dfb8-0000000001fe 37031 1727204386.59345: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000001fe 37031 1727204386.59351: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 37031 1727204386.59402: no more pending results, returning what we have 37031 1727204386.59405: results queue empty 37031 1727204386.59406: checking for any_errors_fatal 37031 1727204386.59411: done checking for any_errors_fatal 37031 1727204386.59412: checking for max_fail_percentage 37031 1727204386.59413: done checking for max_fail_percentage 37031 1727204386.59414: checking to see if all hosts have failed and the running result is not ok 37031 
1727204386.59415: done checking to see if all hosts have failed 37031 1727204386.59416: getting the remaining hosts for this loop 37031 1727204386.59418: done getting the remaining hosts for this loop 37031 1727204386.59422: getting the next task for host managed-node2 37031 1727204386.59429: done getting next task for host managed-node2 37031 1727204386.59433: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 37031 1727204386.59437: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204386.59457: getting variables 37031 1727204386.59459: in VariableManager get_vars() 37031 1727204386.59507: Calling all_inventory to load vars for managed-node2 37031 1727204386.59510: Calling groups_inventory to load vars for managed-node2 37031 1727204386.59512: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.59523: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.59525: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.59528: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.59774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.60102: done with get_vars() 37031 1727204386.60114: done getting variables 37031 1727204386.60184: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.034) 0:00:09.147 ***** 37031 1727204386.60294: entering _queue_task() for managed-node2/set_fact 37031 1727204386.60666: worker is 1 (out of 1 available) 37031 1727204386.60678: exiting _queue_task() for managed-node2/set_fact 37031 1727204386.60689: done queuing things up, now waiting for results queue to drain 37031 1727204386.60690: waiting for pending results... 
37031 1727204386.60969: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 37031 1727204386.61131: in run() - task 0affcd87-79f5-b754-dfb8-0000000001ff 37031 1727204386.61156: variable 'ansible_search_path' from source: unknown 37031 1727204386.61168: variable 'ansible_search_path' from source: unknown 37031 1727204386.61212: calling self._execute() 37031 1727204386.61301: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.61315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.61328: variable 'omit' from source: magic vars 37031 1727204386.61724: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.61747: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.61928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204386.62218: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204386.62272: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204386.62314: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204386.62359: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204386.62460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204386.62493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204386.62531: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204386.62574: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204386.62667: variable '__network_is_ostree' from source: set_fact 37031 1727204386.62678: Evaluated conditional (not __network_is_ostree is defined): False 37031 1727204386.62684: when evaluation is False, skipping this task 37031 1727204386.62689: _execute() done 37031 1727204386.62694: dumping result to json 37031 1727204386.62700: done dumping result, returning 37031 1727204386.62708: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-b754-dfb8-0000000001ff] 37031 1727204386.62715: sending task result for task 0affcd87-79f5-b754-dfb8-0000000001ff skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 37031 1727204386.62852: no more pending results, returning what we have 37031 1727204386.62858: results queue empty 37031 1727204386.62859: checking for any_errors_fatal 37031 1727204386.62867: done checking for any_errors_fatal 37031 1727204386.62868: checking for max_fail_percentage 37031 1727204386.62869: done checking for max_fail_percentage 37031 1727204386.62870: checking to see if all hosts have failed and the running result is not ok 37031 1727204386.62871: done checking to see if all hosts have failed 37031 1727204386.62872: getting the remaining hosts for this loop 37031 1727204386.62874: done getting the remaining hosts for this loop 37031 1727204386.62879: getting the next task for host managed-node2 37031 1727204386.62888: done getting next task for host managed-node2 37031 
1727204386.62893: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 37031 1727204386.62898: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204386.62912: getting variables 37031 1727204386.62914: in VariableManager get_vars() 37031 1727204386.62968: Calling all_inventory to load vars for managed-node2 37031 1727204386.62971: Calling groups_inventory to load vars for managed-node2 37031 1727204386.62974: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204386.62983: Calling all_plugins_play to load vars for managed-node2 37031 1727204386.62986: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204386.62989: Calling groups_plugins_play to load vars for managed-node2 37031 1727204386.63176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204386.63411: done with get_vars() 37031 1727204386.63424: done getting variables 37031 1727204386.63605: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000001ff 37031 1727204386.63608: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:59:46 -0400 (0:00:00.033) 0:00:09.181 ***** 37031 1727204386.63663: entering _queue_task() for managed-node2/service_facts 37031 1727204386.63705: Creating lock for service_facts 37031 1727204386.64117: worker is 1 (out of 1 available) 37031 1727204386.64128: exiting _queue_task() for managed-node2/service_facts 37031 1727204386.64144: done queuing things up, now waiting for results queue to drain 37031 1727204386.64145: waiting for pending results... 37031 1727204386.64415: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 37031 1727204386.64566: in run() - task 0affcd87-79f5-b754-dfb8-000000000201 37031 1727204386.64592: variable 'ansible_search_path' from source: unknown 37031 1727204386.64600: variable 'ansible_search_path' from source: unknown 37031 1727204386.64638: calling self._execute() 37031 1727204386.64732: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.64743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.64758: variable 'omit' from source: magic vars 37031 1727204386.65136: variable 'ansible_distribution_major_version' from source: facts 37031 1727204386.65152: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204386.65166: variable 'omit' from source: magic vars 37031 1727204386.65245: variable 'omit' from source: magic vars 37031 1727204386.65289: variable 'omit' from source: magic vars 37031 1727204386.65331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204386.65383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204386.65408: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204386.65431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.65452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204386.65490: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204386.65498: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.65504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.65597: Set connection var ansible_connection to ssh 37031 1727204386.65603: Set connection var ansible_shell_type to sh 37031 1727204386.65614: Set connection var ansible_pipelining to False 37031 1727204386.65623: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204386.65630: Set connection var ansible_timeout to 10 37031 1727204386.65640: Set connection var ansible_shell_executable to /bin/sh 37031 1727204386.65682: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.65691: variable 'ansible_connection' from source: unknown 37031 1727204386.65698: variable 'ansible_module_compression' from source: unknown 37031 1727204386.65705: variable 'ansible_shell_type' from source: unknown 37031 1727204386.65711: variable 'ansible_shell_executable' from source: unknown 37031 1727204386.65717: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204386.65724: variable 'ansible_pipelining' from source: unknown 37031 1727204386.65730: variable 'ansible_timeout' from source: unknown 37031 1727204386.65737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204386.65968: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204386.65985: variable 'omit' from source: magic vars 37031 1727204386.66000: starting attempt loop 37031 1727204386.66011: running the handler 37031 1727204386.66030: _low_level_execute_command(): starting 37031 1727204386.66043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204386.66848: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204386.66872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.66897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.66916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.66963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.66979: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204386.67001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.67021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204386.67033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204386.67045: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204386.67060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.67078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.67095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.67114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.67127: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204386.67141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.67228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.67252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.67274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.67356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.68941: stdout chunk (state=3): >>>/root <<< 37031 1727204386.69040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.69136: stderr chunk (state=3): >>><<< 37031 1727204386.69148: stdout chunk (state=3): >>><<< 37031 1727204386.69283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.69286: _low_level_execute_command(): starting 37031 1727204386.69290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782 `" && echo ansible-tmp-1727204386.6918154-37811-113159195308782="` echo /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782 `" ) && sleep 0' 37031 1727204386.69929: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204386.69943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.69970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.69989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.70033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.70045: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204386.70063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.70089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204386.70101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204386.70112: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204386.70124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204386.70137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.70152: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.70169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204386.70186: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204386.70202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.70283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.70312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.70328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.70398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.72240: stdout chunk (state=3): >>>ansible-tmp-1727204386.6918154-37811-113159195308782=/root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782 <<< 37031 1727204386.72359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204386.72434: stderr chunk (state=3): >>><<< 37031 1727204386.72439: stdout chunk (state=3): >>><<< 37031 1727204386.72461: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204386.6918154-37811-113159195308782=/root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204386.72511: variable 'ansible_module_compression' from source: unknown 37031 1727204386.72556: ANSIBALLZ: Using lock for service_facts 37031 1727204386.72560: ANSIBALLZ: Acquiring lock 37031 1727204386.72562: ANSIBALLZ: Lock acquired: 140694171554336 37031 1727204386.72566: ANSIBALLZ: Creating module 37031 1727204386.94262: ANSIBALLZ: Writing module into payload 37031 1727204386.94380: ANSIBALLZ: Writing module 37031 1727204386.94409: ANSIBALLZ: Renaming module 37031 1727204386.94415: ANSIBALLZ: Done creating module 37031 1727204386.94434: variable 'ansible_facts' from source: unknown 37031 1727204386.94526: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/AnsiballZ_service_facts.py 37031 1727204386.94785: Sending initial data 37031 1727204386.94789: Sent initial data (162 bytes) 37031 1727204386.96849: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204386.96856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.96897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204386.96903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
37031 1727204386.96921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204386.96927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204386.96940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204386.96944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204386.97032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204386.97039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204386.97044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204386.97113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204386.98905: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204386.98939: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204386.98994: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpd39h7yqr /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/AnsiballZ_service_facts.py <<< 37031 1727204386.99028: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204387.00046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204387.00416: stderr chunk (state=3): >>><<< 37031 1727204387.00420: stdout chunk (state=3): >>><<< 37031 1727204387.00422: done transferring module to remote 37031 1727204387.00424: _low_level_execute_command(): starting 37031 1727204387.00427: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/ /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/AnsiballZ_service_facts.py && sleep 0' 37031 1727204387.01292: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204387.01296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204387.01329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204387.01333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204387.01335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204387.01928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204387.01934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204387.01948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204387.03660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204387.03731: stderr chunk (state=3): >>><<< 37031 1727204387.03734: stdout chunk (state=3): >>><<< 37031 1727204387.03752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204387.03758: _low_level_execute_command(): starting 37031 1727204387.03761: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/AnsiballZ_service_facts.py && sleep 0' 37031 1727204387.04567: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204387.04596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204387.04611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204387.05324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204387.05328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204387.05330: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204387.05332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204387.05334: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204387.05336: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204387.05338: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204387.05340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204387.05342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204387.05344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204387.05346: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204387.05348: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204387.05350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204387.05352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 
1727204387.05357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204387.05359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204387.05361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.34936: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 37031 1727204388.34974: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": 
"active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 37031 1727204388.34980: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 37031 1727204388.34986: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-up<<< 37031 1727204388.35004: stdout chunk (state=3): >>>date.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": 
"systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, 
"invocation": {"module_args": {}}} <<< 37031 1727204388.36308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204388.36311: stdout chunk (state=3): >>><<< 37031 1727204388.36314: stderr chunk (state=3): >>><<< 37031 1727204388.36374: _low_level_execute_command() done: rc=0, stdout= , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204388.36997: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204388.37014: _low_level_execute_command(): starting 37031 1727204388.37024: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204386.6918154-37811-113159195308782/ > /dev/null 2>&1 && sleep 0' 37031 
1727204388.37741: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204388.37759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.37778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.37797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.37849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.37866: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204388.37882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.37900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204388.37912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204388.37931: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204388.37944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.37960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.37980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.37993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.38005: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204388.38019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.38108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.38130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204388.38151: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.38231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.40071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204388.40074: stdout chunk (state=3): >>><<< 37031 1727204388.40076: stderr chunk (state=3): >>><<< 37031 1727204388.40170: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204388.40173: handler run complete 37031 1727204388.40289: variable 'ansible_facts' from source: unknown 37031 1727204388.40425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204388.40885: variable 'ansible_facts' from source: unknown 37031 1727204388.41010: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204388.41195: attempt loop complete, returning result 37031 1727204388.41206: _execute() done 37031 1727204388.41214: dumping result to json 37031 1727204388.41279: done dumping result, returning 37031 1727204388.41293: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-b754-dfb8-000000000201] 37031 1727204388.41302: sending task result for task 0affcd87-79f5-b754-dfb8-000000000201 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204388.42058: no more pending results, returning what we have 37031 1727204388.42061: results queue empty 37031 1727204388.42062: checking for any_errors_fatal 37031 1727204388.42067: done checking for any_errors_fatal 37031 1727204388.42068: checking for max_fail_percentage 37031 1727204388.42070: done checking for max_fail_percentage 37031 1727204388.42071: checking to see if all hosts have failed and the running result is not ok 37031 1727204388.42072: done checking to see if all hosts have failed 37031 1727204388.42073: getting the remaining hosts for this loop 37031 1727204388.42074: done getting the remaining hosts for this loop 37031 1727204388.42078: getting the next task for host managed-node2 37031 1727204388.42085: done getting next task for host managed-node2 37031 1727204388.42089: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 37031 1727204388.42093: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204388.42103: getting variables 37031 1727204388.42104: in VariableManager get_vars() 37031 1727204388.42142: Calling all_inventory to load vars for managed-node2 37031 1727204388.42144: Calling groups_inventory to load vars for managed-node2 37031 1727204388.42147: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204388.42160: Calling all_plugins_play to load vars for managed-node2 37031 1727204388.42163: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204388.42172: Calling groups_plugins_play to load vars for managed-node2 37031 1727204388.42537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204388.43285: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000201 37031 1727204388.43288: WORKER PROCESS EXITING 37031 1727204388.43367: done with get_vars() 37031 1727204388.43381: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:59:48 -0400 (0:00:01.798) 0:00:10.979 ***** 37031 1727204388.43484: entering _queue_task() for managed-node2/package_facts 37031 1727204388.43486: 
Creating lock for package_facts 37031 1727204388.43761: worker is 1 (out of 1 available) 37031 1727204388.43775: exiting _queue_task() for managed-node2/package_facts 37031 1727204388.43788: done queuing things up, now waiting for results queue to drain 37031 1727204388.43789: waiting for pending results... 37031 1727204388.44069: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 37031 1727204388.44216: in run() - task 0affcd87-79f5-b754-dfb8-000000000202 37031 1727204388.44240: variable 'ansible_search_path' from source: unknown 37031 1727204388.44248: variable 'ansible_search_path' from source: unknown 37031 1727204388.44292: calling self._execute() 37031 1727204388.44381: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204388.44392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204388.44407: variable 'omit' from source: magic vars 37031 1727204388.44716: variable 'ansible_distribution_major_version' from source: facts 37031 1727204388.44730: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204388.44733: variable 'omit' from source: magic vars 37031 1727204388.44785: variable 'omit' from source: magic vars 37031 1727204388.44808: variable 'omit' from source: magic vars 37031 1727204388.44841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204388.44871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204388.44887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204388.44905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204388.44914: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204388.44937: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204388.44940: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204388.44945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204388.45018: Set connection var ansible_connection to ssh 37031 1727204388.45022: Set connection var ansible_shell_type to sh 37031 1727204388.45028: Set connection var ansible_pipelining to False 37031 1727204388.45035: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204388.45041: Set connection var ansible_timeout to 10 37031 1727204388.45045: Set connection var ansible_shell_executable to /bin/sh 37031 1727204388.45073: variable 'ansible_shell_executable' from source: unknown 37031 1727204388.45076: variable 'ansible_connection' from source: unknown 37031 1727204388.45079: variable 'ansible_module_compression' from source: unknown 37031 1727204388.45081: variable 'ansible_shell_type' from source: unknown 37031 1727204388.45083: variable 'ansible_shell_executable' from source: unknown 37031 1727204388.45086: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204388.45088: variable 'ansible_pipelining' from source: unknown 37031 1727204388.45090: variable 'ansible_timeout' from source: unknown 37031 1727204388.45093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204388.45238: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204388.45246: variable 'omit' from source: magic vars 37031 1727204388.45251: starting attempt loop 37031 1727204388.45253: running 
the handler 37031 1727204388.45269: _low_level_execute_command(): starting 37031 1727204388.45278: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204388.45799: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.45823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.45841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.45852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.45898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.45910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.45957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.47495: stdout chunk (state=3): >>>/root <<< 37031 1727204388.47621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204388.47648: stderr chunk (state=3): >>><<< 37031 1727204388.47652: stdout chunk (state=3): >>><<< 37031 1727204388.47677: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204388.47688: _low_level_execute_command(): starting 37031 1727204388.47692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826 `" && echo ansible-tmp-1727204388.4767694-37866-123155265942826="` echo /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826 `" ) && sleep 0' 37031 1727204388.48142: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.48163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.48178: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.48190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.48239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.48251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.48310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.50146: stdout chunk (state=3): >>>ansible-tmp-1727204388.4767694-37866-123155265942826=/root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826 <<< 37031 1727204388.50267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204388.50670: stderr chunk (state=3): >>><<< 37031 1727204388.50674: stdout chunk (state=3): >>><<< 37031 1727204388.50677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204388.4767694-37866-123155265942826=/root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204388.50679: variable 'ansible_module_compression' from source: unknown 37031 1727204388.50682: ANSIBALLZ: Using lock for package_facts 37031 1727204388.50684: ANSIBALLZ: Acquiring lock 37031 1727204388.50686: ANSIBALLZ: Lock acquired: 140694169715920 37031 1727204388.50688: ANSIBALLZ: Creating module 37031 1727204388.80170: ANSIBALLZ: Writing module into payload 37031 1727204388.80284: ANSIBALLZ: Writing module 37031 1727204388.80312: ANSIBALLZ: Renaming module 37031 1727204388.80315: ANSIBALLZ: Done creating module 37031 1727204388.80346: variable 'ansible_facts' from source: unknown 37031 1727204388.80487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/AnsiballZ_package_facts.py 37031 1727204388.80605: Sending initial data 37031 1727204388.80615: Sent initial data (162 bytes) 37031 1727204388.81492: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204388.81516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.81532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.81548: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.81598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.81615: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204388.81636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.81659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204388.81676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204388.81687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204388.81698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.81710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.81729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.81748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.81762: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204388.81785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.81871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.81897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204388.81917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.81992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.83812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204388.83878: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204388.84120: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpg4yn3g5b /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/AnsiballZ_package_facts.py <<< 37031 1727204388.84124: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204388.85694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204388.85805: stderr chunk (state=3): >>><<< 37031 1727204388.85809: stdout chunk (state=3): >>><<< 37031 1727204388.85826: done transferring module to remote 37031 1727204388.85836: _low_level_execute_command(): starting 37031 1727204388.85841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/ /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/AnsiballZ_package_facts.py && sleep 0' 37031 1727204388.86498: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204388.86512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.86525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.86541: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.86589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.86601: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204388.86614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.86631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204388.86643: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204388.86662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204388.86677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204388.86695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.86711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.86723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204388.86735: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204388.86750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.86827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.86849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204388.86877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.86948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204388.88673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204388.88724: stderr chunk (state=3): >>><<< 37031 1727204388.88728: stdout chunk (state=3): >>><<< 
37031 1727204388.88741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204388.88744: _low_level_execute_command(): starting 37031 1727204388.88749: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/AnsiballZ_package_facts.py && sleep 0' 37031 1727204388.89267: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.89283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204388.89314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204388.89317: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204388.89320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204388.89375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204388.89383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204388.89428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204389.36059: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": 
"publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", 
"release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": 
"libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": 
"libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", 
"version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{<<< 37031 1727204389.36137: stdout chunk (state=3): >>>"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", 
"version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", 
"release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", 
"version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": 
"5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": 
"rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release":
"481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf":
[{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": 
"1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9",
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": 
"7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 37031 1727204389.37685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204389.37744: stderr chunk (state=3): >>><<< 37031 1727204389.37748: stdout chunk (state=3): >>><<< 37031 1727204389.37785: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
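The `debug1: auto-mux: Trying existing master` and `mux_client_*` lines in this stderr capture show Ansible reusing an already-established SSH connection through OpenSSH connection multiplexing (ControlMaster), which is why each module invocation completes without a fresh handshake. A client-side configuration enabling the same behavior typically looks like this sketch — the path and timeout values are illustrative, not taken from this run:

```
# ~/.ssh/config — illustrative ControlMaster settings (Ansible's ssh
# connection plugin passes equivalent options on the command line)
Host *
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 60s
```

Ansible's `ssh` connection plugin enables multiplexing by default via its own `ControlPath`, so this stanza is only needed when managing the behavior outside Ansible.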
37031 1727204389.40545: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204389.40591: _low_level_execute_command(): starting 37031 1727204389.40601: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204388.4767694-37866-123155265942826/ > /dev/null 2>&1 && sleep 0' 37031 1727204389.41304: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204389.41328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204389.41343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204389.41360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204389.41405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204389.41418: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204389.41440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204389.41459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204389.41474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
<<< 37031 1727204389.41484: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204389.41495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204389.41507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204389.41521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204389.41537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204389.41553: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204389.41569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204389.41651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204389.41673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204389.41688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204389.41767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204389.43641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204389.43644: stdout chunk (state=3): >>><<< 37031 1727204389.43651: stderr chunk (state=3): >>><<< 37031 1727204389.43673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204389.43676: handler run complete 37031 1727204389.44645: variable 'ansible_facts' from source: unknown 37031 1727204389.45115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204389.47463: variable 'ansible_facts' from source: unknown 37031 1727204389.47935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204389.48734: attempt loop complete, returning result 37031 1727204389.48746: _execute() done 37031 1727204389.48750: dumping result to json 37031 1727204389.48979: done dumping result, returning 37031 1727204389.48989: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-b754-dfb8-000000000202] 37031 1727204389.48994: sending task result for task 0affcd87-79f5-b754-dfb8-000000000202 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204389.51862: no more pending results, returning what we have 37031 1727204389.51866: results queue empty 37031 1727204389.51867: checking for any_errors_fatal 37031 1727204389.51872: done checking for any_errors_fatal 37031 1727204389.51873: checking for max_fail_percentage 37031 
1727204389.51874: done checking for max_fail_percentage 37031 1727204389.51875: checking to see if all hosts have failed and the running result is not ok 37031 1727204389.51876: done checking to see if all hosts have failed 37031 1727204389.51877: getting the remaining hosts for this loop 37031 1727204389.51878: done getting the remaining hosts for this loop 37031 1727204389.51882: getting the next task for host managed-node2 37031 1727204389.51889: done getting next task for host managed-node2 37031 1727204389.51892: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 37031 1727204389.51894: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204389.51904: getting variables 37031 1727204389.51906: in VariableManager get_vars() 37031 1727204389.51941: Calling all_inventory to load vars for managed-node2 37031 1727204389.51944: Calling groups_inventory to load vars for managed-node2 37031 1727204389.51946: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204389.51955: Calling all_plugins_play to load vars for managed-node2 37031 1727204389.51958: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204389.51960: Calling groups_plugins_play to load vars for managed-node2 37031 1727204389.52997: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000202 37031 1727204389.53001: WORKER PROCESS EXITING 37031 1727204389.53215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204389.55031: done with get_vars() 37031 1727204389.55057: done getting variables 37031 1727204389.55122: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:59:49 -0400 (0:00:01.116) 0:00:12.096 ***** 37031 1727204389.55154: entering _queue_task() for managed-node2/debug 37031 1727204389.55462: worker is 1 (out of 1 available) 37031 1727204389.55475: exiting _queue_task() for managed-node2/debug 37031 1727204389.55487: done queuing things up, now waiting for results queue to drain 37031 1727204389.55488: waiting for pending results... 
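The task queued here, `fedora.linux_system_roles.network : Print network provider` from `roles/network/tasks/main.yml:7`, is a `debug` action reading the `network_provider` fact (set earlier via `set_fact`, per the variable-source lines that follow). It is roughly equivalent to this sketch; the exact message template in the role may differ:

```yaml
# Hypothetical sketch of the "Print network provider" debug task
- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
```

The rendered result appears further down in the log as `MSG: Using network provider: nm`.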
37031 1727204389.55791: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 37031 1727204389.55939: in run() - task 0affcd87-79f5-b754-dfb8-000000000018 37031 1727204389.55990: variable 'ansible_search_path' from source: unknown 37031 1727204389.55999: variable 'ansible_search_path' from source: unknown 37031 1727204389.56046: calling self._execute() 37031 1727204389.56176: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204389.56212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.56227: variable 'omit' from source: magic vars 37031 1727204389.56851: variable 'ansible_distribution_major_version' from source: facts 37031 1727204389.56872: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204389.56886: variable 'omit' from source: magic vars 37031 1727204389.56949: variable 'omit' from source: magic vars 37031 1727204389.57070: variable 'network_provider' from source: set_fact 37031 1727204389.57099: variable 'omit' from source: magic vars 37031 1727204389.57150: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204389.57191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204389.57219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204389.57246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204389.57265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204389.57300: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204389.57310: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 
1727204389.57357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.57458: Set connection var ansible_connection to ssh 37031 1727204389.57470: Set connection var ansible_shell_type to sh 37031 1727204389.57483: Set connection var ansible_pipelining to False 37031 1727204389.57580: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204389.57592: Set connection var ansible_timeout to 10 37031 1727204389.57602: Set connection var ansible_shell_executable to /bin/sh 37031 1727204389.57634: variable 'ansible_shell_executable' from source: unknown 37031 1727204389.57643: variable 'ansible_connection' from source: unknown 37031 1727204389.57650: variable 'ansible_module_compression' from source: unknown 37031 1727204389.57658: variable 'ansible_shell_type' from source: unknown 37031 1727204389.57667: variable 'ansible_shell_executable' from source: unknown 37031 1727204389.57676: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204389.57689: variable 'ansible_pipelining' from source: unknown 37031 1727204389.57696: variable 'ansible_timeout' from source: unknown 37031 1727204389.57704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.57853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204389.57871: variable 'omit' from source: magic vars 37031 1727204389.57880: starting attempt loop 37031 1727204389.57886: running the handler 37031 1727204389.57938: handler run complete 37031 1727204389.57958: attempt loop complete, returning result 37031 1727204389.57966: _execute() done 37031 1727204389.57974: dumping result to json 37031 1727204389.57981: done dumping result, returning 
37031 1727204389.57992: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-b754-dfb8-000000000018] 37031 1727204389.57999: sending task result for task 0affcd87-79f5-b754-dfb8-000000000018 37031 1727204389.58108: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000018 ok: [managed-node2] => {} MSG: Using network provider: nm 37031 1727204389.58175: no more pending results, returning what we have 37031 1727204389.58179: results queue empty 37031 1727204389.58180: checking for any_errors_fatal 37031 1727204389.58188: done checking for any_errors_fatal 37031 1727204389.58189: checking for max_fail_percentage 37031 1727204389.58191: done checking for max_fail_percentage 37031 1727204389.58192: checking to see if all hosts have failed and the running result is not ok 37031 1727204389.58193: done checking to see if all hosts have failed 37031 1727204389.58194: getting the remaining hosts for this loop 37031 1727204389.58195: done getting the remaining hosts for this loop 37031 1727204389.58200: getting the next task for host managed-node2 37031 1727204389.58207: done getting next task for host managed-node2 37031 1727204389.58211: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 37031 1727204389.58214: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204389.58227: getting variables 37031 1727204389.58229: in VariableManager get_vars() 37031 1727204389.58272: Calling all_inventory to load vars for managed-node2 37031 1727204389.58276: Calling groups_inventory to load vars for managed-node2 37031 1727204389.58279: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204389.58289: Calling all_plugins_play to load vars for managed-node2 37031 1727204389.58291: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204389.58294: Calling groups_plugins_play to load vars for managed-node2 37031 1727204389.59440: WORKER PROCESS EXITING 37031 1727204389.60199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204389.62628: done with get_vars() 37031 1727204389.62660: done getting variables 37031 1727204389.63049: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:59:49 -0400 (0:00:00.079) 0:00:12.175 ***** 37031 1727204389.63085: entering _queue_task() for managed-node2/fail 37031 1727204389.63813: worker is 1 (out of 1 available) 37031 1727204389.63854: exiting _queue_task() for managed-node2/fail 37031 1727204389.63949: done queuing things up, now waiting for results queue to drain 37031 1727204389.63951: waiting for pending results... 
37031 1727204389.65533: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 37031 1727204389.65673: in run() - task 0affcd87-79f5-b754-dfb8-000000000019 37031 1727204389.65693: variable 'ansible_search_path' from source: unknown 37031 1727204389.65701: variable 'ansible_search_path' from source: unknown 37031 1727204389.65743: calling self._execute() 37031 1727204389.66150: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204389.66167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.66181: variable 'omit' from source: magic vars 37031 1727204389.66522: variable 'ansible_distribution_major_version' from source: facts 37031 1727204389.66986: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204389.67120: variable 'network_state' from source: role '' defaults 37031 1727204389.67138: Evaluated conditional (network_state != {}): False 37031 1727204389.67146: when evaluation is False, skipping this task 37031 1727204389.67157: _execute() done 37031 1727204389.67168: dumping result to json 37031 1727204389.67176: done dumping result, returning 37031 1727204389.67188: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-b754-dfb8-000000000019] 37031 1727204389.67197: sending task result for task 0affcd87-79f5-b754-dfb8-000000000019 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204389.67359: no more pending results, returning what we have 37031 1727204389.67362: results queue empty 37031 1727204389.67365: checking for any_errors_fatal 37031 1727204389.67372: done 
checking for any_errors_fatal 37031 1727204389.67373: checking for max_fail_percentage 37031 1727204389.67374: done checking for max_fail_percentage 37031 1727204389.67375: checking to see if all hosts have failed and the running result is not ok 37031 1727204389.67376: done checking to see if all hosts have failed 37031 1727204389.67377: getting the remaining hosts for this loop 37031 1727204389.67378: done getting the remaining hosts for this loop 37031 1727204389.67382: getting the next task for host managed-node2 37031 1727204389.67389: done getting next task for host managed-node2 37031 1727204389.67392: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 37031 1727204389.67395: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204389.67415: getting variables 37031 1727204389.67418: in VariableManager get_vars() 37031 1727204389.67460: Calling all_inventory to load vars for managed-node2 37031 1727204389.67462: Calling groups_inventory to load vars for managed-node2 37031 1727204389.67466: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204389.67479: Calling all_plugins_play to load vars for managed-node2 37031 1727204389.67481: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204389.67485: Calling groups_plugins_play to load vars for managed-node2 37031 1727204389.68100: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000019 37031 1727204389.68104: WORKER PROCESS EXITING 37031 1727204389.70994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204389.74990: done with get_vars() 37031 1727204389.75025: done getting variables 37031 1727204389.75088: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:59:49 -0400 (0:00:00.120) 0:00:12.296 ***** 37031 1727204389.75124: entering _queue_task() for managed-node2/fail 37031 1727204389.75429: worker is 1 (out of 1 available) 37031 1727204389.75441: exiting _queue_task() for managed-node2/fail 37031 1727204389.75453: done queuing things up, now waiting for results queue to drain 37031 1727204389.75455: waiting for pending results... 
37031 1727204389.76116: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 37031 1727204389.76341: in run() - task 0affcd87-79f5-b754-dfb8-00000000001a 37031 1727204389.76357: variable 'ansible_search_path' from source: unknown 37031 1727204389.76361: variable 'ansible_search_path' from source: unknown 37031 1727204389.76402: calling self._execute() 37031 1727204389.76487: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204389.76504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.76518: variable 'omit' from source: magic vars 37031 1727204389.76894: variable 'ansible_distribution_major_version' from source: facts 37031 1727204389.76912: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204389.77050: variable 'network_state' from source: role '' defaults 37031 1727204389.77068: Evaluated conditional (network_state != {}): False 37031 1727204389.77076: when evaluation is False, skipping this task 37031 1727204389.77083: _execute() done 37031 1727204389.77090: dumping result to json 37031 1727204389.77097: done dumping result, returning 37031 1727204389.77107: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-b754-dfb8-00000000001a] 37031 1727204389.77117: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001a 37031 1727204389.77231: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000001a 37031 1727204389.77240: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204389.77303: no more pending results, returning what we have 37031 
1727204389.77307: results queue empty 37031 1727204389.77308: checking for any_errors_fatal 37031 1727204389.77316: done checking for any_errors_fatal 37031 1727204389.77317: checking for max_fail_percentage 37031 1727204389.77319: done checking for max_fail_percentage 37031 1727204389.77320: checking to see if all hosts have failed and the running result is not ok 37031 1727204389.77321: done checking to see if all hosts have failed 37031 1727204389.77322: getting the remaining hosts for this loop 37031 1727204389.77324: done getting the remaining hosts for this loop 37031 1727204389.77330: getting the next task for host managed-node2 37031 1727204389.77337: done getting next task for host managed-node2 37031 1727204389.77342: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 37031 1727204389.77345: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
37031 1727204389.77365: getting variables
37031 1727204389.77368: in VariableManager get_vars()
37031 1727204389.77412: Calling all_inventory to load vars for managed-node2
37031 1727204389.77414: Calling groups_inventory to load vars for managed-node2
37031 1727204389.77417: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204389.77429: Calling all_plugins_play to load vars for managed-node2
37031 1727204389.77432: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204389.77434: Calling groups_plugins_play to load vars for managed-node2
37031 1727204389.79237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204389.81739: done with get_vars()
37031 1727204389.81777: done getting variables
37031 1727204389.81842: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:59:49 -0400 (0:00:00.067) 0:00:12.363 *****
37031 1727204389.81879: entering _queue_task() for managed-node2/fail
37031 1727204389.83142: worker is 1 (out of 1 available)
37031 1727204389.83155: exiting _queue_task() for managed-node2/fail
37031 1727204389.83173: done queuing things up, now waiting for results queue to drain
37031 1727204389.83174: waiting for pending results...
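The skip recorded above comes from the role's `network_state != {}` guard: the abort task only fires when a desired network state was actually supplied, and the role default is an empty dict. A plain-Python sketch of that evaluation (not the role's actual code):

```python
# Mirror of the `when: network_state != {}` conditional seen in the log.
# `network_state` defaults to {} in the role, so the guard is False and
# the task is skipped unless the caller passes a non-empty state.

def network_state_task_runs(network_state: dict) -> bool:
    return network_state != {}

print(network_state_task_runs({}))                  # False -> task skipped (this run)
print(network_state_task_runs({"interfaces": []}))  # True  -> task would run
```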
37031 1727204389.84012: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 37031 1727204389.84238: in run() - task 0affcd87-79f5-b754-dfb8-00000000001b 37031 1727204389.84406: variable 'ansible_search_path' from source: unknown 37031 1727204389.84414: variable 'ansible_search_path' from source: unknown 37031 1727204389.84458: calling self._execute() 37031 1727204389.84662: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204389.84675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204389.84689: variable 'omit' from source: magic vars 37031 1727204389.85377: variable 'ansible_distribution_major_version' from source: facts 37031 1727204389.85487: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204389.85769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204389.90860: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204389.91061: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204389.91106: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204389.91177: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204389.91276: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204389.92367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204389.92802: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204389.92833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204389.92881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204389.92900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204389.93002: variable 'ansible_distribution_major_version' from source: facts
37031 1727204389.93027: Evaluated conditional (ansible_distribution_major_version | int > 9): False
37031 1727204389.93035: when evaluation is False, skipping this task
37031 1727204389.93043: _execute() done
37031 1727204389.93050: dumping result to json
37031 1727204389.93061: done dumping result, returning
37031 1727204389.93078: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-b754-dfb8-00000000001b]
37031 1727204389.93089: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001b
37031 1727204389.93207: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000001b
37031 1727204389.93215: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
37031 1727204389.93268: no more pending results, returning what we have
37031 1727204389.93271:
results queue empty 37031 1727204389.93272: checking for any_errors_fatal 37031 1727204389.93278: done checking for any_errors_fatal 37031 1727204389.93279: checking for max_fail_percentage 37031 1727204389.93281: done checking for max_fail_percentage 37031 1727204389.93282: checking to see if all hosts have failed and the running result is not ok 37031 1727204389.93283: done checking to see if all hosts have failed 37031 1727204389.93283: getting the remaining hosts for this loop 37031 1727204389.93285: done getting the remaining hosts for this loop 37031 1727204389.93289: getting the next task for host managed-node2 37031 1727204389.93296: done getting next task for host managed-node2 37031 1727204389.93300: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 37031 1727204389.93303: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
37031 1727204389.93316: getting variables
37031 1727204389.93318: in VariableManager get_vars()
37031 1727204389.93358: Calling all_inventory to load vars for managed-node2
37031 1727204389.93360: Calling groups_inventory to load vars for managed-node2
37031 1727204389.93362: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204389.93373: Calling all_plugins_play to load vars for managed-node2
37031 1727204389.93375: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204389.93377: Calling groups_plugins_play to load vars for managed-node2
37031 1727204389.95651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204389.99996: done with get_vars()
37031 1727204390.00030: done getting variables
37031 1727204390.00147: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.183) 0:00:12.546 *****
37031 1727204390.00191: entering _queue_task() for managed-node2/dnf
37031 1727204390.00585: worker is 1 (out of 1 available)
37031 1727204390.00603: exiting _queue_task() for managed-node2/dnf
37031 1727204390.00641: done queuing things up, now waiting for results queue to drain
37031 1727204390.00647: waiting for pending results...
37031 1727204390.02600: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 37031 1727204390.02750: in run() - task 0affcd87-79f5-b754-dfb8-00000000001c 37031 1727204390.02780: variable 'ansible_search_path' from source: unknown 37031 1727204390.02789: variable 'ansible_search_path' from source: unknown 37031 1727204390.02908: calling self._execute() 37031 1727204390.02999: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.03284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.03684: variable 'omit' from source: magic vars 37031 1727204390.04256: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.04783: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.05353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204390.10884: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204390.10983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204390.11037: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204390.11084: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204390.11122: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204390.11216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.11370: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.11402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.11489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.11575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.11907: variable 'ansible_distribution' from source: facts 37031 1727204390.11917: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.11938: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 37031 1727204390.12184: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204390.12457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.12561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.12595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.12688: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.12779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.12849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.12885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.12914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.12977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.12999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.13043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.13081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 
1727204390.13110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.13158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.13186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.13361: variable 'network_connections' from source: task vars 37031 1727204390.13380: variable 'interface' from source: play vars 37031 1727204390.13463: variable 'interface' from source: play vars 37031 1727204390.13547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204390.13725: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204390.13769: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204390.13815: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204390.13850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204390.13900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204390.13923: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204390.13970: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204390.13999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
37031 1727204390.14067: variable '__network_team_connections_defined' from source: role '' defaults
37031 1727204390.14318: variable 'network_connections' from source: task vars
37031 1727204390.14327: variable 'interface' from source: play vars
37031 1727204390.14392: variable 'interface' from source: play vars
37031 1727204390.14429: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
37031 1727204390.14452: when evaluation is False, skipping this task
37031 1727204390.14463: _execute() done
37031 1727204390.14473: dumping result to json
37031 1727204390.14485: done dumping result, returning
37031 1727204390.14497: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-00000000001c]
37031 1727204390.14506: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001c
37031 1727204390.14745: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000001c
37031 1727204390.14753: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.14816: no more pending results, returning what we have
37031 1727204390.14819: results queue empty
37031 1727204390.14820: checking for any_errors_fatal
37031 1727204390.14826: done checking for any_errors_fatal
37031 1727204390.14827:
checking for max_fail_percentage 37031 1727204390.14828: done checking for max_fail_percentage 37031 1727204390.14829: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.14830: done checking to see if all hosts have failed 37031 1727204390.14831: getting the remaining hosts for this loop 37031 1727204390.14833: done getting the remaining hosts for this loop 37031 1727204390.14838: getting the next task for host managed-node2 37031 1727204390.14844: done getting next task for host managed-node2 37031 1727204390.14848: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 37031 1727204390.14851: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
37031 1727204390.14866: getting variables
37031 1727204390.14868: in VariableManager get_vars()
37031 1727204390.14910: Calling all_inventory to load vars for managed-node2
37031 1727204390.14912: Calling groups_inventory to load vars for managed-node2
37031 1727204390.14914: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204390.14923: Calling all_plugins_play to load vars for managed-node2
37031 1727204390.14925: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204390.14927: Calling groups_plugins_play to load vars for managed-node2
37031 1727204390.17080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204390.19191: done with get_vars()
37031 1727204390.19221: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
37031 1727204390.19309: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.191) 0:00:12.738 *****
37031 1727204390.19342: entering _queue_task() for managed-node2/yum
37031 1727204390.19344: Creating lock for yum
37031 1727204390.19677: worker is 1 (out of 1 available)
37031 1727204390.19690: exiting _queue_task() for managed-node2/yum
37031 1727204390.19704: done queuing things up, now waiting for results queue to drain
37031 1727204390.19705: waiting for pending results...
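The package-check task above was skipped because `__network_wireless_connections_defined or __network_team_connections_defined` evaluated to False, i.e. the requested `network_connections` contain neither a wireless nor a team interface. A simplified, hypothetical analogue of that scan (the connection list below is invented for illustration; the log does not show the play's actual connections):

```python
# Rough analogue (hypothetical) of the wireless/team detection the role
# performs over `network_connections`. With only an ethernet connection
# defined, both conditions are False and the check tasks are skipped.

def any_of_type(connections: list, conn_type: str) -> bool:
    return any(c.get("type") == conn_type for c in connections)

network_connections = [{"name": "example0", "type": "ethernet"}]  # invented
print(any_of_type(network_connections, "wireless")
      or any_of_type(network_connections, "team"))  # False -> task skipped
```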
37031 1727204390.20471: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 37031 1727204390.20613: in run() - task 0affcd87-79f5-b754-dfb8-00000000001d 37031 1727204390.20632: variable 'ansible_search_path' from source: unknown 37031 1727204390.20639: variable 'ansible_search_path' from source: unknown 37031 1727204390.20685: calling self._execute() 37031 1727204390.20778: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.20789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.20802: variable 'omit' from source: magic vars 37031 1727204390.21189: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.21208: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.21392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204390.23893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204390.23986: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204390.24033: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204390.24083: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204390.24116: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204390.24210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.24244: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204390.24281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204390.24332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204390.24353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204390.25473: variable 'ansible_distribution_major_version' from source: facts
37031 1727204390.25502: Evaluated conditional (ansible_distribution_major_version | int < 8): False
37031 1727204390.25516: when evaluation is False, skipping this task
37031 1727204390.25526: _execute() done
37031 1727204390.25534: dumping result to json
37031 1727204390.25543: done dumping result, returning
37031 1727204390.25559: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-00000000001d]
37031 1727204390.25573: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001d
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.25816: no more pending results, returning what we have
37031 1727204390.25822: results queue empty
37031 1727204390.25823: checking for any_errors_fatal
37031 1727204390.25829: done
checking for any_errors_fatal 37031 1727204390.25829: checking for max_fail_percentage 37031 1727204390.25831: done checking for max_fail_percentage 37031 1727204390.25832: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.25833: done checking to see if all hosts have failed 37031 1727204390.25834: getting the remaining hosts for this loop 37031 1727204390.25836: done getting the remaining hosts for this loop 37031 1727204390.25840: getting the next task for host managed-node2 37031 1727204390.25847: done getting next task for host managed-node2 37031 1727204390.25852: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 37031 1727204390.25857: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
37031 1727204390.25874: getting variables
37031 1727204390.25876: in VariableManager get_vars()
37031 1727204390.25920: Calling all_inventory to load vars for managed-node2
37031 1727204390.25923: Calling groups_inventory to load vars for managed-node2
37031 1727204390.25925: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204390.25936: Calling all_plugins_play to load vars for managed-node2
37031 1727204390.25939: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204390.25942: Calling groups_plugins_play to load vars for managed-node2
37031 1727204390.26863: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000001d
37031 1727204390.26870: WORKER PROCESS EXITING
37031 1727204390.28358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204390.30847: done with get_vars()
37031 1727204390.30897: done getting variables
37031 1727204390.30967: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.116) 0:00:12.854 *****
37031 1727204390.31003: entering _queue_task() for managed-node2/fail
37031 1727204390.31350: worker is 1 (out of 1 available)
37031 1727204390.31367: exiting _queue_task() for managed-node2/fail
37031 1727204390.31381: done queuing things up, now waiting for results queue to drain
37031 1727204390.31382: waiting for pending results...
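The DNF- and YUM-flavoured checks above are guarded by complementary version tests: the log shows the DNF variant's conditional as `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7` (True here) and the YUM variant's as `ansible_distribution_major_version | int < 8` (False here), so at most one of the pair runs. A sketch of that split, with the guard shapes taken from the `Evaluated conditional` lines rather than from the role source:

```python
# Sketch of the two mutually exclusive version guards logged above:
#   DNF variant runs when: distribution == 'Fedora' or major | int > 7
#   YUM variant runs when: major | int < 8
# On this EL9 managed node the DNF check ran and the YUM check was skipped.

def package_check_variant(distribution: str, major_version: str) -> str:
    if distribution == "Fedora" or int(major_version) > 7:
        return "dnf"
    if int(major_version) < 8:
        return "yum"
    return "none"

print(package_check_variant("RedHat", "9"))  # dnf (the YUM task is skipped)
```

Even when the YUM variant does run, this control node would service it with the dnf action plugin, as the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line above records.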
37031 1727204390.31793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 37031 1727204390.31942: in run() - task 0affcd87-79f5-b754-dfb8-00000000001e 37031 1727204390.31971: variable 'ansible_search_path' from source: unknown 37031 1727204390.31983: variable 'ansible_search_path' from source: unknown 37031 1727204390.32026: calling self._execute() 37031 1727204390.32122: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.32135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.32150: variable 'omit' from source: magic vars 37031 1727204390.32537: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.32557: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.32685: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204390.32899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204390.35370: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204390.35458: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204390.35502: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204390.35538: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204390.35576: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204390.35659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 37031 1727204390.35697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.35726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.35788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.35811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.35866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.35899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.35929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.35979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.36003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.36049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.36083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.36118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.36168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.36188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.36385: variable 'network_connections' from source: task vars 37031 1727204390.36401: variable 'interface' from source: play vars 37031 1727204390.36489: variable 'interface' from source: play vars 37031 1727204390.36575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204390.36741: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204390.36785: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204390.36833: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204390.36875: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204390.36919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204390.36945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204390.36981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.37009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204390.37078: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204390.37323: variable 'network_connections' from source: task vars 37031 1727204390.37334: variable 'interface' from source: play vars 37031 1727204390.37406: variable 'interface' from source: play vars 37031 1727204390.37446: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 37031 1727204390.37457: when evaluation is False, skipping this task 37031 1727204390.37466: _execute() done 37031 1727204390.37473: dumping result to json 37031 1727204390.37480: done dumping result, returning 37031 1727204390.37490: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-00000000001e] 37031 1727204390.37498: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001e 37031 1727204390.37621: done sending task result for task 
0affcd87-79f5-b754-dfb8-00000000001e
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.37678: no more pending results, returning what we have 37031 1727204390.37682: results queue empty 37031 1727204390.37683: checking for any_errors_fatal 37031 1727204390.37689: done checking for any_errors_fatal 37031 1727204390.37690: checking for max_fail_percentage 37031 1727204390.37692: done checking for max_fail_percentage 37031 1727204390.37693: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.37694: done checking to see if all hosts have failed 37031 1727204390.37695: getting the remaining hosts for this loop 37031 1727204390.37696: done getting the remaining hosts for this loop 37031 1727204390.37701: getting the next task for host managed-node2 37031 1727204390.37708: done getting next task for host managed-node2 37031 1727204390.37713: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 37031 1727204390.37716: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204390.37731: getting variables 37031 1727204390.37733: in VariableManager get_vars() 37031 1727204390.37781: Calling all_inventory to load vars for managed-node2 37031 1727204390.37784: Calling groups_inventory to load vars for managed-node2 37031 1727204390.37786: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204390.37796: Calling all_plugins_play to load vars for managed-node2 37031 1727204390.37798: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204390.37801: Calling groups_plugins_play to load vars for managed-node2 37031 1727204390.38782: WORKER PROCESS EXITING 37031 1727204390.39522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204390.41139: done with get_vars() 37031 1727204390.41173: done getting variables 37031 1727204390.41239: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.102) 0:00:12.957 *****
37031 1727204390.41280: entering _queue_task() for managed-node2/package 37031 1727204390.41599: worker is 1 (out of 1 available) 37031 1727204390.41611: exiting _queue_task() for managed-node2/package 37031 1727204390.41625: done queuing things up, now waiting for results queue to drain 37031 1727204390.41626: waiting for pending results... 
37031 1727204390.41926: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 37031 1727204390.42079: in run() - task 0affcd87-79f5-b754-dfb8-00000000001f 37031 1727204390.42098: variable 'ansible_search_path' from source: unknown 37031 1727204390.42105: variable 'ansible_search_path' from source: unknown 37031 1727204390.42146: calling self._execute() 37031 1727204390.42237: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.42248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.42265: variable 'omit' from source: magic vars 37031 1727204390.42637: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.42655: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.42853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204390.43126: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204390.43180: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204390.43216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204390.43258: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204390.43376: variable 'network_packages' from source: role '' defaults 37031 1727204390.43492: variable '__network_provider_setup' from source: role '' defaults 37031 1727204390.43507: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204390.43585: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204390.43601: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204390.43670: variable 
'__network_packages_default_nm' from source: role '' defaults 37031 1727204390.43865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204390.46848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204390.46952: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204390.47018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204390.47094: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204390.47128: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204390.47317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.47369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.47403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.47458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.47482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 
1727204390.47530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.47569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.47599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.47646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.47675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.47922: variable '__network_packages_default_gobject_packages' from source: role '' defaults 37031 1727204390.48048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.48083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.48118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.48166: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.48187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.48292: variable 'ansible_python' from source: facts 37031 1727204390.48329: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 37031 1727204390.48431: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204390.48529: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204390.48674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.48702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.48730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.48783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.48801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.48851: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.48895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.48926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.49025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.49042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.49645: variable 'network_connections' from source: task vars 37031 1727204390.49659: variable 'interface' from source: play vars 37031 1727204390.49771: variable 'interface' from source: play vars 37031 1727204390.49851: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204390.49889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204390.49922: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.49963: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204390.50014: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204390.50295: variable 'network_connections' from source: task vars 37031 1727204390.50304: variable 'interface' from source: play vars 37031 1727204390.50402: variable 'interface' from source: play vars 37031 1727204390.50462: variable '__network_packages_default_wireless' from source: role '' defaults 37031 1727204390.50548: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204390.51366: variable 'network_connections' from source: task vars 37031 1727204390.51376: variable 'interface' from source: play vars 37031 1727204390.51441: variable 'interface' from source: play vars 37031 1727204390.51481: variable '__network_packages_default_team' from source: role '' defaults 37031 1727204390.51650: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204390.52217: variable 'network_connections' from source: task vars 37031 1727204390.52340: variable 'interface' from source: play vars 37031 1727204390.52411: variable 'interface' from source: play vars 37031 1727204390.52615: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204390.52798: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204390.52809: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204390.52877: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204390.53438: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 37031 1727204390.54389: variable 'network_connections' from source: task vars 37031 1727204390.54402: variable 'interface' from source: play vars 
37031 1727204390.54585: variable 'interface' from source: play vars 37031 1727204390.54602: variable 'ansible_distribution' from source: facts 37031 1727204390.54611: variable '__network_rh_distros' from source: role '' defaults 37031 1727204390.54627: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.54659: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 37031 1727204390.55022: variable 'ansible_distribution' from source: facts 37031 1727204390.55070: variable '__network_rh_distros' from source: role '' defaults 37031 1727204390.55081: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.55186: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 37031 1727204390.55476: variable 'ansible_distribution' from source: facts 37031 1727204390.55504: variable '__network_rh_distros' from source: role '' defaults 37031 1727204390.55514: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.55647: variable 'network_provider' from source: set_fact 37031 1727204390.55676: variable 'ansible_facts' from source: unknown 37031 1727204390.57187: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 37031 1727204390.57198: when evaluation is False, skipping this task 37031 1727204390.57205: _execute() done 37031 1727204390.57212: dumping result to json 37031 1727204390.57219: done dumping result, returning 37031 1727204390.57238: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-b754-dfb8-00000000001f] 37031 1727204390.57347: sending task result for task 0affcd87-79f5-b754-dfb8-00000000001f
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.57515: no more 
pending results, returning what we have 37031 1727204390.57520: results queue empty 37031 1727204390.57521: checking for any_errors_fatal 37031 1727204390.57528: done checking for any_errors_fatal 37031 1727204390.57528: checking for max_fail_percentage 37031 1727204390.57530: done checking for max_fail_percentage 37031 1727204390.57531: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.57532: done checking to see if all hosts have failed 37031 1727204390.57533: getting the remaining hosts for this loop 37031 1727204390.57535: done getting the remaining hosts for this loop 37031 1727204390.57539: getting the next task for host managed-node2 37031 1727204390.57547: done getting next task for host managed-node2 37031 1727204390.57555: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 37031 1727204390.57558: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204390.57575: getting variables 37031 1727204390.57578: in VariableManager get_vars() 37031 1727204390.57623: Calling all_inventory to load vars for managed-node2 37031 1727204390.57626: Calling groups_inventory to load vars for managed-node2 37031 1727204390.57629: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204390.57642: Calling all_plugins_play to load vars for managed-node2 37031 1727204390.57644: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204390.57647: Calling groups_plugins_play to load vars for managed-node2 37031 1727204390.58686: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000001f 37031 1727204390.58691: WORKER PROCESS EXITING 37031 1727204390.60635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204390.64191: done with get_vars() 37031 1727204390.64227: done getting variables 37031 1727204390.64298: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.230) 0:00:13.188 *****
37031 1727204390.64336: entering _queue_task() for managed-node2/package 37031 1727204390.65779: worker is 1 (out of 1 available) 37031 1727204390.65789: exiting _queue_task() for managed-node2/package 37031 1727204390.65800: done queuing things up, now waiting for results queue to drain 37031 1727204390.65802: waiting for pending results... 
37031 1727204390.65821: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 37031 1727204390.66033: in run() - task 0affcd87-79f5-b754-dfb8-000000000020 37031 1727204390.66046: variable 'ansible_search_path' from source: unknown 37031 1727204390.66049: variable 'ansible_search_path' from source: unknown 37031 1727204390.66202: calling self._execute() 37031 1727204390.66289: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.66293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.66420: variable 'omit' from source: magic vars 37031 1727204390.67116: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.67129: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.67361: variable 'network_state' from source: role '' defaults 37031 1727204390.67373: Evaluated conditional (network_state != {}): False 37031 1727204390.67376: when evaluation is False, skipping this task 37031 1727204390.67496: _execute() done 37031 1727204390.67500: dumping result to json 37031 1727204390.67502: done dumping result, returning 37031 1727204390.67510: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-b754-dfb8-000000000020] 37031 1727204390.67514: sending task result for task 0affcd87-79f5-b754-dfb8-000000000020 37031 1727204390.67620: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000020 37031 1727204390.67623: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.67679: no more pending results, returning what we have 37031 1727204390.67684: results queue empty 37031 1727204390.67685: checking 
for any_errors_fatal 37031 1727204390.67691: done checking for any_errors_fatal 37031 1727204390.67692: checking for max_fail_percentage 37031 1727204390.67694: done checking for max_fail_percentage 37031 1727204390.67695: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.67696: done checking to see if all hosts have failed 37031 1727204390.67697: getting the remaining hosts for this loop 37031 1727204390.67699: done getting the remaining hosts for this loop 37031 1727204390.67703: getting the next task for host managed-node2 37031 1727204390.67709: done getting next task for host managed-node2 37031 1727204390.67713: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 37031 1727204390.67717: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204390.67732: getting variables 37031 1727204390.67734: in VariableManager get_vars() 37031 1727204390.67779: Calling all_inventory to load vars for managed-node2 37031 1727204390.67782: Calling groups_inventory to load vars for managed-node2 37031 1727204390.67784: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204390.67795: Calling all_plugins_play to load vars for managed-node2 37031 1727204390.67797: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204390.67800: Calling groups_plugins_play to load vars for managed-node2 37031 1727204390.70401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204390.73978: done with get_vars() 37031 1727204390.74018: done getting variables 37031 1727204390.74092: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.097) 0:00:13.286 *****
37031 1727204390.74129: entering _queue_task() for managed-node2/package 37031 1727204390.74468: worker is 1 (out of 1 available) 37031 1727204390.74481: exiting _queue_task() for managed-node2/package 37031 1727204390.74494: done queuing things up, now waiting for results queue to drain 37031 1727204390.74495: waiting for pending results... 
37031 1727204390.75482: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 37031 1727204390.75737: in run() - task 0affcd87-79f5-b754-dfb8-000000000021 37031 1727204390.75838: variable 'ansible_search_path' from source: unknown 37031 1727204390.75843: variable 'ansible_search_path' from source: unknown 37031 1727204390.75907: calling self._execute() 37031 1727204390.76102: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.76106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.76116: variable 'omit' from source: magic vars 37031 1727204390.76971: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.76983: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.77216: variable 'network_state' from source: role '' defaults 37031 1727204390.77226: Evaluated conditional (network_state != {}): False 37031 1727204390.77229: when evaluation is False, skipping this task 37031 1727204390.77232: _execute() done 37031 1727204390.77235: dumping result to json 37031 1727204390.77240: done dumping result, returning 37031 1727204390.77246: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-b754-dfb8-000000000021] 37031 1727204390.77251: sending task result for task 0affcd87-79f5-b754-dfb8-000000000021 37031 1727204390.77465: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000021 37031 1727204390.77469: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
37031 1727204390.77517: no more pending results, returning what we have 37031 1727204390.77521: results queue empty 37031 1727204390.77522: checking for 
any_errors_fatal 37031 1727204390.77527: done checking for any_errors_fatal 37031 1727204390.77528: checking for max_fail_percentage 37031 1727204390.77530: done checking for max_fail_percentage 37031 1727204390.77531: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.77532: done checking to see if all hosts have failed 37031 1727204390.77533: getting the remaining hosts for this loop 37031 1727204390.77534: done getting the remaining hosts for this loop 37031 1727204390.77538: getting the next task for host managed-node2 37031 1727204390.77545: done getting next task for host managed-node2 37031 1727204390.77549: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 37031 1727204390.77552: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204390.77572: getting variables 37031 1727204390.77574: in VariableManager get_vars() 37031 1727204390.77617: Calling all_inventory to load vars for managed-node2 37031 1727204390.77620: Calling groups_inventory to load vars for managed-node2 37031 1727204390.77622: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204390.77634: Calling all_plugins_play to load vars for managed-node2 37031 1727204390.77637: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204390.77641: Calling groups_plugins_play to load vars for managed-node2 37031 1727204390.84499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204390.86862: done with get_vars() 37031 1727204390.86895: done getting variables 37031 1727204390.87109: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.130) 0:00:13.416 ***** 37031 1727204390.87143: entering _queue_task() for managed-node2/service 37031 1727204390.87145: Creating lock for service 37031 1727204390.87872: worker is 1 (out of 1 available) 37031 1727204390.87885: exiting _queue_task() for managed-node2/service 37031 1727204390.87898: done queuing things up, now waiting for results queue to drain 37031 1727204390.87900: waiting for pending results... 
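The skip recorded above (`Evaluated conditional (network_state != {}): False` → "when evaluation is False, skipping this task") is the standard Ansible `when`-guard pattern. A minimal sketch of such a guarded install task, modeled on the task name and condition visible in the log — the module and arguments are assumptions, not the role's exact source:

```yaml
# Hypothetical sketch of a conditionally skipped install task.
# The task name and the `when` expression are taken from the log;
# the package module body is an assumption for illustration.
- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  # The log shows network_state defaulted to {}, so this evaluated
  # False and the task was skipped with "Conditional result was False".
  when: network_state != {}
```

When the condition is false, the task result still appears in the output as `skipping: [host]` with `"false_condition"` naming the expression that failed, exactly as seen above.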
37031 1727204390.88204: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 37031 1727204390.88345: in run() - task 0affcd87-79f5-b754-dfb8-000000000022 37031 1727204390.88382: variable 'ansible_search_path' from source: unknown 37031 1727204390.88388: variable 'ansible_search_path' from source: unknown 37031 1727204390.88408: calling self._execute() 37031 1727204390.88486: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.88490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.88500: variable 'omit' from source: magic vars 37031 1727204390.88795: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.88807: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.88889: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204390.89027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204390.92519: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204390.92610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204390.92656: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204390.92698: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204390.92729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204390.92815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 37031 1727204390.92851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.92885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.92931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.92954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.93006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.93034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.93067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.93111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.93131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.93179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204390.93208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204390.93238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.93284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204390.93302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204390.93508: variable 'network_connections' from source: task vars 37031 1727204390.93520: variable 'interface' from source: play vars 37031 1727204390.93582: variable 'interface' from source: play vars 37031 1727204390.93646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204390.93780: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204390.93823: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204390.93845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204390.93873: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204390.94287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204390.94290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204390.94293: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204390.94295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204390.94298: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204390.94319: variable 'network_connections' from source: task vars 37031 1727204390.94325: variable 'interface' from source: play vars 37031 1727204390.94398: variable 'interface' from source: play vars 37031 1727204390.94439: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 37031 1727204390.94443: when evaluation is False, skipping this task 37031 1727204390.94445: _execute() done 37031 1727204390.94447: dumping result to json 37031 1727204390.94449: done dumping result, returning 37031 1727204390.94454: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-000000000022] 37031 1727204390.94462: sending task result for task 0affcd87-79f5-b754-dfb8-000000000022 37031 1727204390.94568: done sending task result for task 
0affcd87-79f5-b754-dfb8-000000000022 37031 1727204390.94577: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 37031 1727204390.94635: no more pending results, returning what we have 37031 1727204390.94639: results queue empty 37031 1727204390.94640: checking for any_errors_fatal 37031 1727204390.94647: done checking for any_errors_fatal 37031 1727204390.94648: checking for max_fail_percentage 37031 1727204390.94650: done checking for max_fail_percentage 37031 1727204390.94650: checking to see if all hosts have failed and the running result is not ok 37031 1727204390.94651: done checking to see if all hosts have failed 37031 1727204390.94652: getting the remaining hosts for this loop 37031 1727204390.94654: done getting the remaining hosts for this loop 37031 1727204390.94658: getting the next task for host managed-node2 37031 1727204390.94666: done getting next task for host managed-node2 37031 1727204390.94670: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 37031 1727204390.94673: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204390.94691: getting variables 37031 1727204390.94693: in VariableManager get_vars() 37031 1727204390.94739: Calling all_inventory to load vars for managed-node2 37031 1727204390.94743: Calling groups_inventory to load vars for managed-node2 37031 1727204390.94745: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204390.94753: Calling all_plugins_play to load vars for managed-node2 37031 1727204390.94756: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204390.94758: Calling groups_plugins_play to load vars for managed-node2 37031 1727204390.96858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204390.97968: done with get_vars() 37031 1727204390.97988: done getting variables 37031 1727204390.98033: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:59:50 -0400 (0:00:00.109) 0:00:13.525 ***** 37031 1727204390.98059: entering _queue_task() for managed-node2/service 37031 1727204390.98289: worker is 1 (out of 1 available) 37031 1727204390.98303: exiting _queue_task() for managed-node2/service 37031 1727204390.98316: done queuing things up, now waiting for results queue to drain 37031 1727204390.98318: waiting for pending results... 
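The two service tasks traced here follow a restart-vs-enable split: the restart is guarded by wireless/team detection (evaluated False above, so skipped), while enable-and-start runs whenever the provider is NetworkManager. A sketch of that pattern, using the variable names shown in the log — the task bodies are assumptions, not the role's verbatim source:

```yaml
# Hypothetical sketch of the restart-vs-enable pattern the log walks
# through. Variable names (__network_*_connections_defined,
# network_service_name, network_provider, network_state) come from the
# log; the service module arguments are illustrative assumptions.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Skipped in this run: both conditions evaluated False.
  when: __network_wireless_connections_defined or __network_team_connections_defined

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  # Ran in this run: network_provider was "nm", so the condition was True.
  when: network_provider == "nm" or network_state != {}
```

The second task is the one whose execution continues below: the `service` action plugin is loaded, an SSH connection is established over the existing ControlMaster mux, and a remote temp directory is created under `~/.ansible/tmp` before the module payload is transferred.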
37031 1727204390.98593: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 37031 1727204390.98733: in run() - task 0affcd87-79f5-b754-dfb8-000000000023 37031 1727204390.98753: variable 'ansible_search_path' from source: unknown 37031 1727204390.98761: variable 'ansible_search_path' from source: unknown 37031 1727204390.98804: calling self._execute() 37031 1727204390.98910: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204390.98920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204390.98932: variable 'omit' from source: magic vars 37031 1727204390.99329: variable 'ansible_distribution_major_version' from source: facts 37031 1727204390.99348: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204390.99531: variable 'network_provider' from source: set_fact 37031 1727204390.99541: variable 'network_state' from source: role '' defaults 37031 1727204390.99561: Evaluated conditional (network_provider == "nm" or network_state != {}): True 37031 1727204390.99573: variable 'omit' from source: magic vars 37031 1727204390.99627: variable 'omit' from source: magic vars 37031 1727204390.99674: variable 'network_service_name' from source: role '' defaults 37031 1727204390.99755: variable 'network_service_name' from source: role '' defaults 37031 1727204390.99877: variable '__network_provider_setup' from source: role '' defaults 37031 1727204390.99895: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204390.99961: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204390.99985: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204391.00055: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204391.00328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 37031 1727204391.02600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204391.02659: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204391.02686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204391.02713: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204391.02736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204391.02795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.02818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.02838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.02867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.02879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.02915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 37031 1727204391.02929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.02947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.02976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.02987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.03139: variable '__network_packages_default_gobject_packages' from source: role '' defaults 37031 1727204391.03221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.03241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.03259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.03287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.03297: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.03365: variable 'ansible_python' from source: facts 37031 1727204391.03385: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 37031 1727204391.03441: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204391.03501: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204391.03584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.03603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.03620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.03645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.03658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.03694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.03714: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.03732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.03759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.03769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.03867: variable 'network_connections' from source: task vars 37031 1727204391.03879: variable 'interface' from source: play vars 37031 1727204391.03973: variable 'interface' from source: play vars 37031 1727204391.04096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204391.04328: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204391.04393: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204391.04440: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204391.04496: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204391.04570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204391.04608: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204391.04643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.04700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204391.04756: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204391.05101: variable 'network_connections' from source: task vars 37031 1727204391.05119: variable 'interface' from source: play vars 37031 1727204391.05196: variable 'interface' from source: play vars 37031 1727204391.05238: variable '__network_packages_default_wireless' from source: role '' defaults 37031 1727204391.05302: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204391.05495: variable 'network_connections' from source: task vars 37031 1727204391.05498: variable 'interface' from source: play vars 37031 1727204391.05550: variable 'interface' from source: play vars 37031 1727204391.05574: variable '__network_packages_default_team' from source: role '' defaults 37031 1727204391.05628: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204391.05818: variable 'network_connections' from source: task vars 37031 1727204391.05822: variable 'interface' from source: play vars 37031 1727204391.05876: variable 'interface' from source: play vars 37031 1727204391.05920: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204391.05968: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204391.05973: 
variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204391.06013: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204391.06152: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 37031 1727204391.06486: variable 'network_connections' from source: task vars 37031 1727204391.06489: variable 'interface' from source: play vars 37031 1727204391.06534: variable 'interface' from source: play vars 37031 1727204391.06543: variable 'ansible_distribution' from source: facts 37031 1727204391.06545: variable '__network_rh_distros' from source: role '' defaults 37031 1727204391.06552: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.06576: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 37031 1727204391.06692: variable 'ansible_distribution' from source: facts 37031 1727204391.06695: variable '__network_rh_distros' from source: role '' defaults 37031 1727204391.06699: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.06710: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 37031 1727204391.06827: variable 'ansible_distribution' from source: facts 37031 1727204391.06830: variable '__network_rh_distros' from source: role '' defaults 37031 1727204391.06833: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.06866: variable 'network_provider' from source: set_fact 37031 1727204391.06884: variable 'omit' from source: magic vars 37031 1727204391.06908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204391.06928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204391.06945: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 
1727204391.06961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204391.06972: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204391.06999: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204391.07002: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204391.07004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204391.07071: Set connection var ansible_connection to ssh 37031 1727204391.07074: Set connection var ansible_shell_type to sh 37031 1727204391.07080: Set connection var ansible_pipelining to False 37031 1727204391.07087: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204391.07092: Set connection var ansible_timeout to 10 37031 1727204391.07097: Set connection var ansible_shell_executable to /bin/sh 37031 1727204391.07120: variable 'ansible_shell_executable' from source: unknown 37031 1727204391.07123: variable 'ansible_connection' from source: unknown 37031 1727204391.07125: variable 'ansible_module_compression' from source: unknown 37031 1727204391.07127: variable 'ansible_shell_type' from source: unknown 37031 1727204391.07130: variable 'ansible_shell_executable' from source: unknown 37031 1727204391.07132: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204391.07136: variable 'ansible_pipelining' from source: unknown 37031 1727204391.07138: variable 'ansible_timeout' from source: unknown 37031 1727204391.07142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204391.07220: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204391.07229: variable 'omit' from source: magic vars 37031 1727204391.07238: starting attempt loop 37031 1727204391.07241: running the handler 37031 1727204391.07298: variable 'ansible_facts' from source: unknown 37031 1727204391.07786: _low_level_execute_command(): starting 37031 1727204391.07791: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204391.08322: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.08331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.08362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.08378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.08389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.08440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.08451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 
1727204391.08517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.10181: stdout chunk (state=3): >>>/root <<< 37031 1727204391.10282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204391.10343: stderr chunk (state=3): >>><<< 37031 1727204391.10347: stdout chunk (state=3): >>><<< 37031 1727204391.10369: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204391.10380: _low_level_execute_command(): starting 37031 1727204391.10386: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776 `" && echo ansible-tmp-1727204391.1037061-37961-37355822057776="` echo 
/root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776 `" ) && sleep 0' 37031 1727204391.11130: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204391.11140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.11152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.11172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.11223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.11231: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204391.11243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.11260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204391.11272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204391.11279: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204391.11289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.11306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.11323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.11331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.11338: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204391.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.11436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.11455: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204391.11475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204391.11551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.13419: stdout chunk (state=3): >>>ansible-tmp-1727204391.1037061-37961-37355822057776=/root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776 <<< 37031 1727204391.13536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204391.13594: stderr chunk (state=3): >>><<< 37031 1727204391.13597: stdout chunk (state=3): >>><<< 37031 1727204391.13612: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204391.1037061-37961-37355822057776=/root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 37031 1727204391.13640: variable 'ansible_module_compression' from source: unknown 37031 1727204391.13691: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 37031 1727204391.13695: ANSIBALLZ: Acquiring lock 37031 1727204391.13698: ANSIBALLZ: Lock acquired: 140694173153808 37031 1727204391.13700: ANSIBALLZ: Creating module 37031 1727204391.39147: ANSIBALLZ: Writing module into payload 37031 1727204391.39387: ANSIBALLZ: Writing module 37031 1727204391.39432: ANSIBALLZ: Renaming module 37031 1727204391.39443: ANSIBALLZ: Done creating module 37031 1727204391.39492: variable 'ansible_facts' from source: unknown 37031 1727204391.39684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/AnsiballZ_systemd.py 37031 1727204391.39853: Sending initial data 37031 1727204391.39859: Sent initial data (155 bytes) 37031 1727204391.40866: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204391.40883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.40897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.40916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.40963: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.40987: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204391.41003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.41021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204391.41032: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204391.41042: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204391.41052: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.41070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.41085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.41096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.41106: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204391.41119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.41202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.41218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204391.41234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204391.41309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.43146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204391.43223: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204391.43228: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpb9ue083_ 
/root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/AnsiballZ_systemd.py <<< 37031 1727204391.43271: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204391.45875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204391.46044: stderr chunk (state=3): >>><<< 37031 1727204391.46048: stdout chunk (state=3): >>><<< 37031 1727204391.46050: done transferring module to remote 37031 1727204391.46052: _low_level_execute_command(): starting 37031 1727204391.46057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/ /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/AnsiballZ_systemd.py && sleep 0' 37031 1727204391.46681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204391.46704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.46725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.46744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.46799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.46817: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204391.46837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.46859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204391.46876: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204391.46888: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204391.46900: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 37031 1727204391.46914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.46938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.46951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.46967: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204391.46982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.47074: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.47093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204391.47109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204391.47189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.48915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204391.48995: stderr chunk (state=3): >>><<< 37031 1727204391.48999: stdout chunk (state=3): >>><<< 37031 1727204391.49016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204391.49020: _low_level_execute_command(): starting 37031 1727204391.49027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/AnsiballZ_systemd.py && sleep 0' 37031 1727204391.49716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204391.49722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.49735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.49750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.49794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.49800: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204391.49811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.49824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204391.49830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204391.49837: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204391.49844: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.49852: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 37031 1727204391.49865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.49875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.49881: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204391.49890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.49969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.49973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204391.49980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204391.50062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.75146: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", 
"ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManag<<< 37031 1727204391.75197: stdout chunk (state=3): >>>er.service", "ControlGroupId": "3602", "MemoryCurrent": "6983680", "MemoryAvailable": "infinity", "CPUUsageNSec": "1779008000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", 
"MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module 
cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdo<<< 37031 1727204391.75206: stdout chunk (state=3): >>>gSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 37031 1727204391.76715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 
10.31.13.78 closed. <<< 37031 1727204391.76719: stderr chunk (state=3): >>><<< 37031 1727204391.76724: stdout chunk (state=3): >>><<< 37031 1727204391.76747: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6983680", "MemoryAvailable": "infinity", "CPUUsageNSec": "1779008000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204391.76936: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204391.76955: _low_level_execute_command(): starting 37031 1727204391.76965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204391.1037061-37961-37355822057776/ > /dev/null 2>&1 && sleep 0' 37031 1727204391.77697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204391.77705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.77716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.77730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.77783: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.77790: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204391.77800: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.77812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204391.77820: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204391.77826: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204391.77834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204391.77844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204391.77866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204391.77873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204391.77880: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204391.77889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204391.77963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204391.77991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204391.78003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204391.78071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204391.79919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204391.79923: stdout chunk (state=3): >>><<< 37031 1727204391.79928: stderr chunk (state=3): >>><<< 37031 1727204391.79948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204391.79957: handler run complete 37031 1727204391.80008: attempt loop complete, returning result 37031 1727204391.80012: _execute() done 37031 1727204391.80014: dumping result to json 37031 1727204391.80027: done dumping result, returning 37031 1727204391.80037: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-b754-dfb8-000000000023] 37031 1727204391.80043: sending task result for task 0affcd87-79f5-b754-dfb8-000000000023 37031 1727204391.80319: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000023 37031 1727204391.80321: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204391.80421: no more pending results, returning what we have 37031 1727204391.80426: results queue empty 37031 1727204391.80427: checking for any_errors_fatal 37031 1727204391.80436: done checking for any_errors_fatal 
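The task result above ("Enable and start NetworkManager", reported `ok` with `censored` output because `no_log: true` was set) was produced by an `ansible.legacy.systemd` invocation whose `module_args` are visible earlier in the trace. A hypothetical sketch of a task that would produce this kind of result — the real task lives in the fedora.linux_system_roles.network role and is not shown in this log — looks roughly like:

```yaml
# Sketch only: reconstructed from the module_args in the trace above
# (name=NetworkManager, state=started, enabled=true); not the role's
# actual source.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true  # matches the "censored ... 'no_log: true'" result shown
```

The `no_log: true` setting is why the full systemd unit properties appear only in the debug stream and the final task result is censored.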
37031 1727204391.80437: checking for max_fail_percentage 37031 1727204391.80438: done checking for max_fail_percentage 37031 1727204391.80439: checking to see if all hosts have failed and the running result is not ok 37031 1727204391.80440: done checking to see if all hosts have failed 37031 1727204391.80441: getting the remaining hosts for this loop 37031 1727204391.80443: done getting the remaining hosts for this loop 37031 1727204391.80447: getting the next task for host managed-node2 37031 1727204391.80453: done getting next task for host managed-node2 37031 1727204391.80458: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 37031 1727204391.80461: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204391.80478: getting variables 37031 1727204391.80480: in VariableManager get_vars() 37031 1727204391.80524: Calling all_inventory to load vars for managed-node2 37031 1727204391.80527: Calling groups_inventory to load vars for managed-node2 37031 1727204391.80530: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204391.80540: Calling all_plugins_play to load vars for managed-node2 37031 1727204391.80543: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204391.80546: Calling groups_plugins_play to load vars for managed-node2 37031 1727204391.82206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204391.83748: done with get_vars() 37031 1727204391.83772: done getting variables 37031 1727204391.83817: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:59:51 -0400 (0:00:00.857) 0:00:14.383 ***** 37031 1727204391.83850: entering _queue_task() for managed-node2/service 37031 1727204391.84080: worker is 1 (out of 1 available) 37031 1727204391.84094: exiting _queue_task() for managed-node2/service 37031 1727204391.84108: done queuing things up, now waiting for results queue to drain 37031 1727204391.84109: waiting for pending results... 
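The task queued here ("Enable and start wpa_supplicant") is evaluated against three conditionals in the records that follow: `ansible_distribution_major_version != '6'` (True), `network_provider == "nm"` (True), and `__network_wpa_supplicant_required` (False), so the task is skipped. A hypothetical sketch of such a conditional service task — variable names taken from the trace, task body assumed — would be:

```yaml
# Sketch only: illustrates the conditional chain evaluated in the trace;
# the actual task body in the role is not shown in this log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required  # False here, so the task skips
```

Because `when` conditions are ANDed, the first False member short-circuits the task into a skip, which is exactly what the `skipping: [managed-node2]` result below reports.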
37031 1727204391.84301: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 37031 1727204391.84395: in run() - task 0affcd87-79f5-b754-dfb8-000000000024 37031 1727204391.84406: variable 'ansible_search_path' from source: unknown 37031 1727204391.84410: variable 'ansible_search_path' from source: unknown 37031 1727204391.84441: calling self._execute() 37031 1727204391.84518: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204391.84522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204391.84529: variable 'omit' from source: magic vars 37031 1727204391.84815: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.84824: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204391.84907: variable 'network_provider' from source: set_fact 37031 1727204391.84912: Evaluated conditional (network_provider == "nm"): True 37031 1727204391.84981: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204391.85044: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204391.85251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204391.87467: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204391.87515: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204391.87542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204391.87572: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204391.87593: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204391.87655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.87681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.87698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.87728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.87738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.87776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.87792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.87809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.87838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.87848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.87880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204391.87899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204391.87915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.87944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204391.87957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204391.88059: variable 'network_connections' from source: task vars 37031 1727204391.88070: variable 'interface' from source: play vars 37031 1727204391.88126: variable 'interface' from source: play vars 37031 1727204391.88183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204391.88329: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204391.88355: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204391.88402: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204391.88439: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204391.88496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204391.88537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204391.88577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204391.88617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204391.88690: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204391.88981: variable 'network_connections' from source: task vars 37031 1727204391.88993: variable 'interface' from source: play vars 37031 1727204391.89071: variable 'interface' from source: play vars 37031 1727204391.89123: Evaluated conditional (__network_wpa_supplicant_required): False 37031 1727204391.89135: when evaluation is False, skipping this task 37031 1727204391.89142: _execute() done 37031 1727204391.89149: dumping result to json 37031 1727204391.89156: done dumping result, returning 37031 1727204391.89169: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-b754-dfb8-000000000024] 37031 
1727204391.89191: sending task result for task 0affcd87-79f5-b754-dfb8-000000000024 37031 1727204391.89306: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000024 37031 1727204391.89314: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 37031 1727204391.89371: no more pending results, returning what we have 37031 1727204391.89375: results queue empty 37031 1727204391.89375: checking for any_errors_fatal 37031 1727204391.89394: done checking for any_errors_fatal 37031 1727204391.89395: checking for max_fail_percentage 37031 1727204391.89397: done checking for max_fail_percentage 37031 1727204391.89398: checking to see if all hosts have failed and the running result is not ok 37031 1727204391.89399: done checking to see if all hosts have failed 37031 1727204391.89399: getting the remaining hosts for this loop 37031 1727204391.89401: done getting the remaining hosts for this loop 37031 1727204391.89406: getting the next task for host managed-node2 37031 1727204391.89413: done getting next task for host managed-node2 37031 1727204391.89417: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 37031 1727204391.89419: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204391.89434: getting variables 37031 1727204391.89436: in VariableManager get_vars() 37031 1727204391.89480: Calling all_inventory to load vars for managed-node2 37031 1727204391.89482: Calling groups_inventory to load vars for managed-node2 37031 1727204391.89485: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204391.89494: Calling all_plugins_play to load vars for managed-node2 37031 1727204391.89496: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204391.89499: Calling groups_plugins_play to load vars for managed-node2 37031 1727204391.90878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204391.91794: done with get_vars() 37031 1727204391.91812: done getting variables 37031 1727204391.91859: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:59:51 -0400 (0:00:00.080) 0:00:14.463 ***** 37031 1727204391.91884: entering _queue_task() for managed-node2/service 37031 1727204391.92115: worker is 1 (out of 1 available) 37031 1727204391.92152: exiting _queue_task() for managed-node2/service 37031 1727204391.92187: done queuing things up, now waiting for results queue to drain 37031 1727204391.92189: waiting for pending results... 
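The "Enable network service" task queued here is skipped in the following records because `network_provider == "initscripts"` evaluates to False (the provider is `nm` on this run). A minimal hypothetical sketch of a provider-gated task of this shape — the service name and body are assumptions, not taken from the log — might be:

```yaml
# Sketch only: the gating condition is taken from the trace; the service
# name "network" is an assumption typical of initscripts-based systems.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"  # False with the nm provider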
37031 1727204391.92462: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 37031 1727204391.92615: in run() - task 0affcd87-79f5-b754-dfb8-000000000025 37031 1727204391.92634: variable 'ansible_search_path' from source: unknown 37031 1727204391.92641: variable 'ansible_search_path' from source: unknown 37031 1727204391.92685: calling self._execute() 37031 1727204391.92798: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204391.92811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204391.92828: variable 'omit' from source: magic vars 37031 1727204391.93225: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.93248: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204391.93361: variable 'network_provider' from source: set_fact 37031 1727204391.93374: Evaluated conditional (network_provider == "initscripts"): False 37031 1727204391.93380: when evaluation is False, skipping this task 37031 1727204391.93387: _execute() done 37031 1727204391.93393: dumping result to json 37031 1727204391.93399: done dumping result, returning 37031 1727204391.93407: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-b754-dfb8-000000000025] 37031 1727204391.93414: sending task result for task 0affcd87-79f5-b754-dfb8-000000000025 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204391.93563: no more pending results, returning what we have 37031 1727204391.93568: results queue empty 37031 1727204391.93569: checking for any_errors_fatal 37031 1727204391.93580: done checking for any_errors_fatal 37031 1727204391.93581: checking for max_fail_percentage 37031 1727204391.93583: done checking for max_fail_percentage 37031 
1727204391.93584: checking to see if all hosts have failed and the running result is not ok 37031 1727204391.93585: done checking to see if all hosts have failed 37031 1727204391.93586: getting the remaining hosts for this loop 37031 1727204391.93588: done getting the remaining hosts for this loop 37031 1727204391.93592: getting the next task for host managed-node2 37031 1727204391.93599: done getting next task for host managed-node2 37031 1727204391.93603: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 37031 1727204391.93607: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204391.93623: getting variables 37031 1727204391.93625: in VariableManager get_vars() 37031 1727204391.93675: Calling all_inventory to load vars for managed-node2 37031 1727204391.93679: Calling groups_inventory to load vars for managed-node2 37031 1727204391.93681: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204391.93696: Calling all_plugins_play to load vars for managed-node2 37031 1727204391.93698: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204391.93701: Calling groups_plugins_play to load vars for managed-node2 37031 1727204391.95128: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000025 37031 1727204391.95132: WORKER PROCESS EXITING 37031 1727204391.95142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204391.96193: done with get_vars() 37031 1727204391.96209: done getting variables 37031 1727204391.96258: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:59:51 -0400 (0:00:00.044) 0:00:14.507 ***** 37031 1727204391.96287: entering _queue_task() for managed-node2/copy 37031 1727204391.96598: worker is 1 (out of 1 available) 37031 1727204391.96612: exiting _queue_task() for managed-node2/copy 37031 1727204391.96625: done queuing things up, now waiting for results queue to drain 37031 1727204391.96626: waiting for pending results... 
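The copy task queued here ("Ensure initscripts network file dependency is present") is gated on the same `network_provider == "initscripts"` condition and is likewise skipped under the `nm` provider. A hypothetical sketch — the destination path and content are illustrative assumptions, since the log only shows the task name and the false condition:

```yaml
# Sketch only: dest and content are assumed for illustration; the trace
# shows only that a copy action was gated on the initscripts provider.
- name: Ensure initscripts network file dependency is present
  ansible.builtin.copy:
    dest: /etc/sysconfig/network
    content: "# Created by the network system role"
  when: network_provider == "initscripts"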
37031 1727204391.96937: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 37031 1727204391.97087: in run() - task 0affcd87-79f5-b754-dfb8-000000000026 37031 1727204391.97106: variable 'ansible_search_path' from source: unknown 37031 1727204391.97113: variable 'ansible_search_path' from source: unknown 37031 1727204391.97153: calling self._execute() 37031 1727204391.97274: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204391.97302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204391.97316: variable 'omit' from source: magic vars 37031 1727204391.97786: variable 'ansible_distribution_major_version' from source: facts 37031 1727204391.97796: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204391.97882: variable 'network_provider' from source: set_fact 37031 1727204391.97887: Evaluated conditional (network_provider == "initscripts"): False 37031 1727204391.97890: when evaluation is False, skipping this task 37031 1727204391.97893: _execute() done 37031 1727204391.97897: dumping result to json 37031 1727204391.97900: done dumping result, returning 37031 1727204391.97907: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-b754-dfb8-000000000026] 37031 1727204391.97912: sending task result for task 0affcd87-79f5-b754-dfb8-000000000026 37031 1727204391.98009: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000026 37031 1727204391.98011: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 37031 1727204391.98058: no more pending results, returning what we have 37031 1727204391.98062: results queue empty 37031 1727204391.98063: checking for 
any_errors_fatal 37031 1727204391.98076: done checking for any_errors_fatal 37031 1727204391.98077: checking for max_fail_percentage 37031 1727204391.98079: done checking for max_fail_percentage 37031 1727204391.98080: checking to see if all hosts have failed and the running result is not ok 37031 1727204391.98081: done checking to see if all hosts have failed 37031 1727204391.98081: getting the remaining hosts for this loop 37031 1727204391.98083: done getting the remaining hosts for this loop 37031 1727204391.98087: getting the next task for host managed-node2 37031 1727204391.98094: done getting next task for host managed-node2 37031 1727204391.98098: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 37031 1727204391.98102: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204391.98118: getting variables 37031 1727204391.98119: in VariableManager get_vars() 37031 1727204391.98159: Calling all_inventory to load vars for managed-node2 37031 1727204391.98161: Calling groups_inventory to load vars for managed-node2 37031 1727204391.98163: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204391.98174: Calling all_plugins_play to load vars for managed-node2 37031 1727204391.98177: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204391.98179: Calling groups_plugins_play to load vars for managed-node2 37031 1727204391.98977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204391.99891: done with get_vars() 37031 1727204391.99909: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:59:51 -0400 (0:00:00.036) 0:00:14.544 ***** 37031 1727204391.99977: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 37031 1727204391.99978: Creating lock for fedora.linux_system_roles.network_connections 37031 1727204392.00216: worker is 1 (out of 1 available) 37031 1727204392.00230: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 37031 1727204392.00244: done queuing things up, now waiting for results queue to drain 37031 1727204392.00245: waiting for pending results... 
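The task queued here is the role's core action, `fedora.linux_system_roles.network_connections`, which consumes the `network_connections` task variable (itself templated from the play-level `interface` variable, as the variable-resolution records below show). A hypothetical sketch of how a play typically drives this role — the actual play vars and the value of `interface` are not shown in this trace:

```yaml
# Sketch only: illustrates the network_connections/interface wiring that
# the variable lookups in the trace imply; not the actual test playbook.
- hosts: managed-node2
  vars:
    interface: testnic1  # assumed value; the real one is not in the log
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: "{{ interface }}"
            state: up
```

The role translates each entry in `network_connections` into a NetworkManager connection profile, which is why the subsequent records resolve `__lsr_ansible_managed` and render the `get_ansible_managed.j2` template before executing the module.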
37031 1727204392.00428: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 37031 1727204392.00521: in run() - task 0affcd87-79f5-b754-dfb8-000000000027 37031 1727204392.00533: variable 'ansible_search_path' from source: unknown 37031 1727204392.00537: variable 'ansible_search_path' from source: unknown 37031 1727204392.00568: calling self._execute() 37031 1727204392.00639: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204392.00643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204392.00651: variable 'omit' from source: magic vars 37031 1727204392.00931: variable 'ansible_distribution_major_version' from source: facts 37031 1727204392.00940: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204392.00948: variable 'omit' from source: magic vars 37031 1727204392.00989: variable 'omit' from source: magic vars 37031 1727204392.01107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204392.02687: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204392.02735: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204392.02767: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204392.02795: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204392.02814: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204392.02880: variable 'network_provider' from source: set_fact 37031 1727204392.02980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204392.03015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204392.03032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204392.03061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204392.03072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204392.03129: variable 'omit' from source: magic vars 37031 1727204392.03215: variable 'omit' from source: magic vars 37031 1727204392.03292: variable 'network_connections' from source: task vars 37031 1727204392.03304: variable 'interface' from source: play vars 37031 1727204392.03357: variable 'interface' from source: play vars 37031 1727204392.03480: variable 'omit' from source: magic vars 37031 1727204392.03487: variable '__lsr_ansible_managed' from source: task vars 37031 1727204392.03530: variable '__lsr_ansible_managed' from source: task vars 37031 1727204392.03987: Loaded config def from plugin (lookup/template) 37031 1727204392.03990: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 37031 1727204392.04013: File lookup term: get_ansible_managed.j2 37031 1727204392.04016: variable 'ansible_search_path' from source: unknown 37031 1727204392.04021: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 37031 1727204392.04032: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 37031 1727204392.04046: variable 'ansible_search_path' from source: unknown 37031 1727204392.08127: variable 'ansible_managed' from source: unknown 37031 1727204392.08224: variable 'omit' from source: magic vars 37031 1727204392.08248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204392.08273: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204392.08289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204392.08302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204392.08311: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204392.08334: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204392.08337: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204392.08340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204392.08407: Set connection var ansible_connection to ssh 37031 1727204392.08411: Set connection var ansible_shell_type to sh 37031 1727204392.08415: Set connection var ansible_pipelining to False 37031 1727204392.08422: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204392.08430: Set connection var ansible_timeout to 10 37031 1727204392.08433: Set connection var ansible_shell_executable to /bin/sh 37031 1727204392.08454: variable 'ansible_shell_executable' from source: unknown 37031 1727204392.08458: variable 'ansible_connection' from source: unknown 37031 1727204392.08462: variable 'ansible_module_compression' from source: unknown 37031 1727204392.08471: variable 'ansible_shell_type' from source: unknown 37031 1727204392.08474: variable 'ansible_shell_executable' from source: unknown 37031 1727204392.08477: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204392.08482: variable 'ansible_pipelining' from source: unknown 37031 1727204392.08484: variable 'ansible_timeout' from source: unknown 37031 1727204392.08488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204392.08588: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204392.08599: variable 'omit' from source: magic vars 37031 1727204392.08602: starting attempt loop 37031 
1727204392.08605: running the handler 37031 1727204392.08616: _low_level_execute_command(): starting 37031 1727204392.08623: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204392.09704: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204392.09732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204392.11127: stdout chunk (state=3): >>>/root <<< 37031 1727204392.11253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204392.11337: stderr chunk (state=3): >>><<< 37031 1727204392.11353: stdout chunk (state=3): >>><<< 37031 1727204392.11382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204392.11400: _low_level_execute_command(): starting 37031 1727204392.11410: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309 `" && echo ansible-tmp-1727204392.1138995-38001-28184471877309="` echo /root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309 `" ) && sleep 0' 37031 1727204392.12125: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204392.12138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.12150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.12170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.12226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.12240: stderr chunk (state=3): >>>debug2: match not found <<< 37031 
1727204392.12255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.12276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204392.12288: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204392.12299: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204392.12315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.12333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.12349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.12361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.12375: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204392.12389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.12475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204392.12500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204392.12517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204392.12625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204392.14499: stdout chunk (state=3): >>>ansible-tmp-1727204392.1138995-38001-28184471877309=/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309 <<< 37031 1727204392.14713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204392.14717: stdout chunk (state=3): >>><<< 37031 1727204392.14719: stderr chunk (state=3): >>><<< 37031 1727204392.14935: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204392.1138995-38001-28184471877309=/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204392.14939: variable 'ansible_module_compression' from source: unknown 37031 1727204392.14941: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 37031 1727204392.14943: ANSIBALLZ: Acquiring lock 37031 1727204392.14946: ANSIBALLZ: Lock acquired: 140694168797152 37031 1727204392.14948: ANSIBALLZ: Creating module 37031 1727204392.45306: ANSIBALLZ: Writing module into payload 37031 1727204392.45794: ANSIBALLZ: Writing module 37031 1727204392.45833: ANSIBALLZ: Renaming module 37031 1727204392.45844: ANSIBALLZ: Done creating module 37031 1727204392.45879: variable 'ansible_facts' from source: unknown 37031 1727204392.45989: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/AnsiballZ_network_connections.py 37031 1727204392.46159: Sending initial data 37031 1727204392.46163: Sent initial data (167 bytes) 37031 1727204392.47163: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204392.47183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.47197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.47213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.47261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.47284: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204392.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.47313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204392.47324: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204392.47335: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204392.47345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.47360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.47379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.47392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.47403: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204392.47416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 
1727204392.47493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204392.47517: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204392.47535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204392.47617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204392.49884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204392.49970: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204392.49974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpqyw8ykmc /root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/AnsiballZ_network_connections.py <<< 37031 1727204392.50303: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204392.52583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204392.52702: stderr chunk (state=3): >>><<< 37031 1727204392.52705: stdout chunk (state=3): >>><<< 37031 1727204392.52707: done transferring module to remote 37031 1727204392.52709: _low_level_execute_command(): starting 37031 1727204392.52712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/ /root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/AnsiballZ_network_connections.py && sleep 0' 37031 1727204392.53539: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204392.53548: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.53561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.53576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.53627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.53635: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204392.53646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.53661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204392.53671: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204392.53678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204392.53686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.53696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.53718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.53725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.53731: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204392.53740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.53815: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204392.53835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204392.53845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204392.53941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204392.56063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204392.56070: stdout chunk (state=3): >>><<< 37031 1727204392.56073: stderr chunk (state=3): >>><<< 37031 1727204392.56095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204392.56098: _low_level_execute_command(): starting 37031 1727204392.56101: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/AnsiballZ_network_connections.py && sleep 0' 37031 1727204392.56852: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204392.56874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.56894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.56915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.56962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.56982: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204392.56995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.57010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204392.57025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204392.57035: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204392.57046: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204392.57060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204392.57078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204392.57090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204392.57101: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204392.57116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204392.57199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
37031 1727204392.57221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204392.57238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204392.57319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204394.82094: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 37031 1727204394.84039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204394.84095: stderr chunk (state=3): >>><<< 37031 1727204394.84099: stdout chunk (state=3): >>><<< 37031 1727204394.84116: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204394.84149: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204394.84161: _low_level_execute_command(): starting 37031 1727204394.84166: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204392.1138995-38001-28184471877309/ > /dev/null 2>&1 && sleep 0' 37031 1727204394.84618: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 
1727204394.84622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204394.84650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204394.84653: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204394.84658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204394.84704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204394.84711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204394.84771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204394.86577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204394.86631: stderr chunk (state=3): >>><<< 37031 1727204394.86634: stdout chunk (state=3): >>><<< 37031 1727204394.86649: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204394.86661: handler run complete 37031 1727204394.86685: attempt loop complete, returning result 37031 1727204394.86688: _execute() done 37031 1727204394.86690: dumping result to json 37031 1727204394.86698: done dumping result, returning 37031 1727204394.86706: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-b754-dfb8-000000000027] 37031 1727204394.86710: sending task result for task 0affcd87-79f5-b754-dfb8-000000000027 37031 1727204394.86814: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000027 37031 1727204394.86817: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 
3d37e8b2-4205-4a19-9842-5a81810c6006 [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active) 37031 1727204394.86923: no more pending results, returning what we have 37031 1727204394.86928: results queue empty 37031 1727204394.86929: checking for any_errors_fatal 37031 1727204394.86938: done checking for any_errors_fatal 37031 1727204394.86939: checking for max_fail_percentage 37031 1727204394.86941: done checking for max_fail_percentage 37031 1727204394.86941: checking to see if all hosts have failed and the running result is not ok 37031 1727204394.86942: done checking to see if all hosts have failed 37031 1727204394.86943: getting the remaining hosts for this loop 37031 1727204394.86945: done getting the remaining hosts for this loop 37031 1727204394.86949: getting the next task for host managed-node2 37031 1727204394.86954: done getting next task for host managed-node2 37031 1727204394.86960: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 37031 1727204394.86962: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204394.86981: getting variables 37031 1727204394.86982: in VariableManager get_vars() 37031 1727204394.87022: Calling all_inventory to load vars for managed-node2 37031 1727204394.87025: Calling groups_inventory to load vars for managed-node2 37031 1727204394.87026: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204394.87035: Calling all_plugins_play to load vars for managed-node2 37031 1727204394.87037: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204394.87039: Calling groups_plugins_play to load vars for managed-node2 37031 1727204394.88036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204394.88937: done with get_vars() 37031 1727204394.88952: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:59:54 -0400 (0:00:02.890) 0:00:17.435 ***** 37031 1727204394.89015: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 37031 1727204394.89016: Creating lock for fedora.linux_system_roles.network_state 37031 1727204394.89244: worker is 1 (out of 1 available) 37031 1727204394.89260: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 37031 1727204394.89275: done queuing things up, now waiting for results queue to drain 37031 1727204394.89277: waiting for pending results... 
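[Editor's annotation] The `module_args` JSON in the task result above fully determines the role input that was passed in. The following is a hedged reconstruction of the playbook variables that would produce it — variable names (`network_connections`, `network_provider`) follow the documented interface of the `fedora.linux_system_roles.network` role, and all values are copied from the log; this is a sketch, not the actual playbook source.

```yaml
# Reconstructed role input matching the module_args shown above.
# Values (veth0, the 2001:db8:: addresses, provider "nm") come from the log;
# the variable names are the role's documented interface, assumed here.
network_provider: nm
network_connections:
  - name: veth0
    type: ethernet
    state: up
    ip:
      dhcp4: false          # matches "dhcp4": false in module_args
      auto6: false          # matches "auto6": false
      address:
        - "2001:db8::2/32"
        - "2001:db8::3/32"
        - "2001:db8::4/32"
      gateway6: "2001:db8::1"
```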
37031 1727204394.89450: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 37031 1727204394.89538: in run() - task 0affcd87-79f5-b754-dfb8-000000000028 37031 1727204394.89550: variable 'ansible_search_path' from source: unknown 37031 1727204394.89554: variable 'ansible_search_path' from source: unknown 37031 1727204394.89589: calling self._execute() 37031 1727204394.89658: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.89662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.89671: variable 'omit' from source: magic vars 37031 1727204394.89947: variable 'ansible_distribution_major_version' from source: facts 37031 1727204394.89960: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204394.90042: variable 'network_state' from source: role '' defaults 37031 1727204394.90052: Evaluated conditional (network_state != {}): False 37031 1727204394.90058: when evaluation is False, skipping this task 37031 1727204394.90061: _execute() done 37031 1727204394.90066: dumping result to json 37031 1727204394.90068: done dumping result, returning 37031 1727204394.90071: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-b754-dfb8-000000000028] 37031 1727204394.90076: sending task result for task 0affcd87-79f5-b754-dfb8-000000000028 37031 1727204394.90168: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000028 37031 1727204394.90171: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204394.90220: no more pending results, returning what we have 37031 1727204394.90224: results queue empty 37031 1727204394.90224: checking for any_errors_fatal 37031 1727204394.90235: done checking for any_errors_fatal 
37031 1727204394.90236: checking for max_fail_percentage 37031 1727204394.90237: done checking for max_fail_percentage 37031 1727204394.90238: checking to see if all hosts have failed and the running result is not ok 37031 1727204394.90239: done checking to see if all hosts have failed 37031 1727204394.90240: getting the remaining hosts for this loop 37031 1727204394.90241: done getting the remaining hosts for this loop 37031 1727204394.90245: getting the next task for host managed-node2 37031 1727204394.90251: done getting next task for host managed-node2 37031 1727204394.90255: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 37031 1727204394.90260: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204394.90275: getting variables 37031 1727204394.90277: in VariableManager get_vars() 37031 1727204394.90316: Calling all_inventory to load vars for managed-node2 37031 1727204394.90319: Calling groups_inventory to load vars for managed-node2 37031 1727204394.90321: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204394.90329: Calling all_plugins_play to load vars for managed-node2 37031 1727204394.90331: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204394.90333: Calling groups_plugins_play to load vars for managed-node2 37031 1727204394.91094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204394.92006: done with get_vars() 37031 1727204394.92022: done getting variables 37031 1727204394.92071: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:59:54 -0400 (0:00:00.030) 0:00:17.465 ***** 37031 1727204394.92094: entering _queue_task() for managed-node2/debug 37031 1727204394.92304: worker is 1 (out of 1 available) 37031 1727204394.92317: exiting _queue_task() for managed-node2/debug 37031 1727204394.92329: done queuing things up, now waiting for results queue to drain 37031 1727204394.92330: waiting for pending results... 
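[Editor's annotation] The task queued next ("Show stderr messages for the network_connections", `main.yml:177`) is a plain `debug` task: the log shows the `debug` action plugin being loaded and, in the result below, the value of `__network_connections_result.stderr_lines` being printed. A minimal sketch of what that task likely looks like in the role (the exact YAML is an assumption; only the task name, file/line, and the printed variable are taken from the log):

```yaml
# Hedged sketch of roles/network/tasks/main.yml:177 — a debug task that
# surfaces the stderr lines captured from the connection-profile module.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```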
37031 1727204394.92502: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 37031 1727204394.92595: in run() - task 0affcd87-79f5-b754-dfb8-000000000029 37031 1727204394.92608: variable 'ansible_search_path' from source: unknown 37031 1727204394.92611: variable 'ansible_search_path' from source: unknown 37031 1727204394.92639: calling self._execute() 37031 1727204394.92707: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.92712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.92719: variable 'omit' from source: magic vars 37031 1727204394.92984: variable 'ansible_distribution_major_version' from source: facts 37031 1727204394.92995: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204394.92998: variable 'omit' from source: magic vars 37031 1727204394.93038: variable 'omit' from source: magic vars 37031 1727204394.93062: variable 'omit' from source: magic vars 37031 1727204394.93096: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204394.93122: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204394.93141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204394.93154: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204394.93168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204394.93190: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204394.93193: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.93196: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 37031 1727204394.93271: Set connection var ansible_connection to ssh 37031 1727204394.93274: Set connection var ansible_shell_type to sh 37031 1727204394.93280: Set connection var ansible_pipelining to False 37031 1727204394.93287: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204394.93292: Set connection var ansible_timeout to 10 37031 1727204394.93297: Set connection var ansible_shell_executable to /bin/sh 37031 1727204394.93317: variable 'ansible_shell_executable' from source: unknown 37031 1727204394.93320: variable 'ansible_connection' from source: unknown 37031 1727204394.93322: variable 'ansible_module_compression' from source: unknown 37031 1727204394.93326: variable 'ansible_shell_type' from source: unknown 37031 1727204394.93328: variable 'ansible_shell_executable' from source: unknown 37031 1727204394.93330: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.93333: variable 'ansible_pipelining' from source: unknown 37031 1727204394.93336: variable 'ansible_timeout' from source: unknown 37031 1727204394.93338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.93436: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204394.93445: variable 'omit' from source: magic vars 37031 1727204394.93455: starting attempt loop 37031 1727204394.93458: running the handler 37031 1727204394.93545: variable '__network_connections_result' from source: set_fact 37031 1727204394.93593: handler run complete 37031 1727204394.93606: attempt loop complete, returning result 37031 1727204394.93609: _execute() done 37031 1727204394.93612: dumping result to json 37031 1727204394.93614: 
done dumping result, returning 37031 1727204394.93621: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-b754-dfb8-000000000029] 37031 1727204394.93625: sending task result for task 0affcd87-79f5-b754-dfb8-000000000029 37031 1727204394.93709: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000029 37031 1727204394.93712: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active)" ] } 37031 1727204394.93768: no more pending results, returning what we have 37031 1727204394.93771: results queue empty 37031 1727204394.93772: checking for any_errors_fatal 37031 1727204394.93785: done checking for any_errors_fatal 37031 1727204394.93786: checking for max_fail_percentage 37031 1727204394.93787: done checking for max_fail_percentage 37031 1727204394.93788: checking to see if all hosts have failed and the running result is not ok 37031 1727204394.93789: done checking to see if all hosts have failed 37031 1727204394.93790: getting the remaining hosts for this loop 37031 1727204394.93791: done getting the remaining hosts for this loop 37031 1727204394.93795: getting the next task for host managed-node2 37031 1727204394.93800: done getting next task for host managed-node2 37031 1727204394.93804: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 37031 1727204394.93806: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204394.93817: getting variables 37031 1727204394.93818: in VariableManager get_vars() 37031 1727204394.93853: Calling all_inventory to load vars for managed-node2 37031 1727204394.93855: Calling groups_inventory to load vars for managed-node2 37031 1727204394.93857: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204394.93867: Calling all_plugins_play to load vars for managed-node2 37031 1727204394.93869: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204394.93872: Calling groups_plugins_play to load vars for managed-node2 37031 1727204394.94757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204394.95658: done with get_vars() 37031 1727204394.95676: done getting variables 37031 1727204394.95719: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:59:54 -0400 (0:00:00.036) 0:00:17.502 ***** 37031 1727204394.95744: entering _queue_task() for managed-node2/debug 37031 1727204394.95967: worker is 1 (out of 1 available) 37031 1727204394.95982: exiting _queue_task() for 
managed-node2/debug 37031 1727204394.95994: done queuing things up, now waiting for results queue to drain 37031 1727204394.95996: waiting for pending results... 37031 1727204394.96180: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 37031 1727204394.96271: in run() - task 0affcd87-79f5-b754-dfb8-00000000002a 37031 1727204394.96287: variable 'ansible_search_path' from source: unknown 37031 1727204394.96291: variable 'ansible_search_path' from source: unknown 37031 1727204394.96322: calling self._execute() 37031 1727204394.96394: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.96397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.96407: variable 'omit' from source: magic vars 37031 1727204394.96683: variable 'ansible_distribution_major_version' from source: facts 37031 1727204394.96692: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204394.96697: variable 'omit' from source: magic vars 37031 1727204394.96736: variable 'omit' from source: magic vars 37031 1727204394.96762: variable 'omit' from source: magic vars 37031 1727204394.96796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204394.96824: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204394.96841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204394.96854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204394.96867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204394.96891: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 37031 1727204394.96894: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.96897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.96967: Set connection var ansible_connection to ssh 37031 1727204394.96970: Set connection var ansible_shell_type to sh 37031 1727204394.96975: Set connection var ansible_pipelining to False 37031 1727204394.96983: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204394.96988: Set connection var ansible_timeout to 10 37031 1727204394.96994: Set connection var ansible_shell_executable to /bin/sh 37031 1727204394.97013: variable 'ansible_shell_executable' from source: unknown 37031 1727204394.97015: variable 'ansible_connection' from source: unknown 37031 1727204394.97018: variable 'ansible_module_compression' from source: unknown 37031 1727204394.97021: variable 'ansible_shell_type' from source: unknown 37031 1727204394.97023: variable 'ansible_shell_executable' from source: unknown 37031 1727204394.97026: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204394.97029: variable 'ansible_pipelining' from source: unknown 37031 1727204394.97031: variable 'ansible_timeout' from source: unknown 37031 1727204394.97035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204394.97138: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204394.97150: variable 'omit' from source: magic vars 37031 1727204394.97155: starting attempt loop 37031 1727204394.97160: running the handler 37031 1727204394.97202: variable '__network_connections_result' from source: set_fact 37031 1727204394.97262: variable 
'__network_connections_result' from source: set_fact 37031 1727204394.97352: handler run complete 37031 1727204394.97378: attempt loop complete, returning result 37031 1727204394.97384: _execute() done 37031 1727204394.97388: dumping result to json 37031 1727204394.97392: done dumping result, returning 37031 1727204394.97399: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-b754-dfb8-00000000002a] 37031 1727204394.97404: sending task result for task 0affcd87-79f5-b754-dfb8-00000000002a 37031 1727204394.97497: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000002a 37031 1727204394.97500: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 3d37e8b2-4205-4a19-9842-5a81810c6006 (not-active)" ] } } 37031 1727204394.97618: no more pending results, returning what we have 37031 1727204394.97621: results queue empty 37031 1727204394.97622: checking for any_errors_fatal 37031 1727204394.97627: done checking for any_errors_fatal 37031 
1727204394.97628: checking for max_fail_percentage 37031 1727204394.97629: done checking for max_fail_percentage 37031 1727204394.97630: checking to see if all hosts have failed and the running result is not ok 37031 1727204394.97631: done checking to see if all hosts have failed 37031 1727204394.97631: getting the remaining hosts for this loop 37031 1727204394.97633: done getting the remaining hosts for this loop 37031 1727204394.97637: getting the next task for host managed-node2 37031 1727204394.97642: done getting next task for host managed-node2 37031 1727204394.97646: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 37031 1727204394.97649: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204394.97660: getting variables 37031 1727204394.97661: in VariableManager get_vars() 37031 1727204394.97694: Calling all_inventory to load vars for managed-node2 37031 1727204394.97702: Calling groups_inventory to load vars for managed-node2 37031 1727204394.97703: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204394.97710: Calling all_plugins_play to load vars for managed-node2 37031 1727204394.97712: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204394.97714: Calling groups_plugins_play to load vars for managed-node2 37031 1727204394.98485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204394.99409: done with get_vars() 37031 1727204394.99427: done getting variables 37031 1727204394.99474: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:59:54 -0400 (0:00:00.037) 0:00:17.539 ***** 37031 1727204394.99498: entering _queue_task() for managed-node2/debug 37031 1727204394.99722: worker is 1 (out of 1 available) 37031 1727204394.99737: exiting _queue_task() for managed-node2/debug 37031 1727204394.99749: done queuing things up, now waiting for results queue to drain 37031 1727204394.99750: waiting for pending results... 
37031 1727204394.99933: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 37031 1727204395.00031: in run() - task 0affcd87-79f5-b754-dfb8-00000000002b 37031 1727204395.00044: variable 'ansible_search_path' from source: unknown 37031 1727204395.00047: variable 'ansible_search_path' from source: unknown 37031 1727204395.00082: calling self._execute() 37031 1727204395.00147: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.00150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.00157: variable 'omit' from source: magic vars 37031 1727204395.00439: variable 'ansible_distribution_major_version' from source: facts 37031 1727204395.00449: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204395.00535: variable 'network_state' from source: role '' defaults 37031 1727204395.00550: Evaluated conditional (network_state != {}): False 37031 1727204395.00566: when evaluation is False, skipping this task 37031 1727204395.00569: _execute() done 37031 1727204395.00572: dumping result to json 37031 1727204395.00575: done dumping result, returning 37031 1727204395.00581: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-b754-dfb8-00000000002b] 37031 1727204395.00585: sending task result for task 0affcd87-79f5-b754-dfb8-00000000002b 37031 1727204395.00679: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000002b 37031 1727204395.00683: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 37031 1727204395.00729: no more pending results, returning what we have 37031 1727204395.00733: results queue empty 37031 1727204395.00733: checking for any_errors_fatal 37031 1727204395.00742: done checking for any_errors_fatal 37031 1727204395.00743: checking for 
max_fail_percentage 37031 1727204395.00744: done checking for max_fail_percentage 37031 1727204395.00745: checking to see if all hosts have failed and the running result is not ok 37031 1727204395.00746: done checking to see if all hosts have failed 37031 1727204395.00747: getting the remaining hosts for this loop 37031 1727204395.00748: done getting the remaining hosts for this loop 37031 1727204395.00752: getting the next task for host managed-node2 37031 1727204395.00757: done getting next task for host managed-node2 37031 1727204395.00761: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 37031 1727204395.00766: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204395.00780: getting variables 37031 1727204395.00782: in VariableManager get_vars() 37031 1727204395.00821: Calling all_inventory to load vars for managed-node2 37031 1727204395.00824: Calling groups_inventory to load vars for managed-node2 37031 1727204395.00830: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.00839: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.00841: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.00843: Calling groups_plugins_play to load vars for managed-node2 37031 1727204395.01687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.02580: done with get_vars() 37031 1727204395.02594: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:59:55 -0400 (0:00:00.031) 0:00:17.571 ***** 37031 1727204395.02662: entering _queue_task() for managed-node2/ping 37031 1727204395.02665: Creating lock for ping 37031 1727204395.02876: worker is 1 (out of 1 available) 37031 1727204395.02892: exiting _queue_task() for managed-node2/ping 37031 1727204395.02904: done queuing things up, now waiting for results queue to drain 37031 1727204395.02906: waiting for pending results... 
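[Editor's annotation] The "Re-test connectivity" task (`main.yml:192`) is executed via the `ping` module — the log shows `entering _queue_task() for managed-node2/ping` and "Creating lock for ping" below. `ping` performs a trivial module round-trip over the existing SSH connection, confirming the host is still reachable and Python-executable after the network change. A minimal sketch (the YAML body is assumed; the task name, path, and module come from the log):

```yaml
# Hedged sketch of roles/network/tasks/main.yml:192 — verifies the managed
# host still responds after connection profiles were (re)applied.
- name: Re-test connectivity
  ping:
```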
37031 1727204395.03084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 37031 1727204395.03172: in run() - task 0affcd87-79f5-b754-dfb8-00000000002c 37031 1727204395.03184: variable 'ansible_search_path' from source: unknown 37031 1727204395.03187: variable 'ansible_search_path' from source: unknown 37031 1727204395.03216: calling self._execute() 37031 1727204395.03289: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.03293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.03298: variable 'omit' from source: magic vars 37031 1727204395.03566: variable 'ansible_distribution_major_version' from source: facts 37031 1727204395.03578: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204395.03583: variable 'omit' from source: magic vars 37031 1727204395.03619: variable 'omit' from source: magic vars 37031 1727204395.03644: variable 'omit' from source: magic vars 37031 1727204395.03679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204395.03705: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204395.03723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204395.03735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204395.03747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204395.03771: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204395.03776: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.03778: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 37031 1727204395.03841: Set connection var ansible_connection to ssh 37031 1727204395.03845: Set connection var ansible_shell_type to sh 37031 1727204395.03851: Set connection var ansible_pipelining to False 37031 1727204395.03857: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204395.03868: Set connection var ansible_timeout to 10 37031 1727204395.03872: Set connection var ansible_shell_executable to /bin/sh 37031 1727204395.03893: variable 'ansible_shell_executable' from source: unknown 37031 1727204395.03896: variable 'ansible_connection' from source: unknown 37031 1727204395.03899: variable 'ansible_module_compression' from source: unknown 37031 1727204395.03901: variable 'ansible_shell_type' from source: unknown 37031 1727204395.03903: variable 'ansible_shell_executable' from source: unknown 37031 1727204395.03906: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.03912: variable 'ansible_pipelining' from source: unknown 37031 1727204395.03914: variable 'ansible_timeout' from source: unknown 37031 1727204395.03918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.04067: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204395.04076: variable 'omit' from source: magic vars 37031 1727204395.04081: starting attempt loop 37031 1727204395.04083: running the handler 37031 1727204395.04095: _low_level_execute_command(): starting 37031 1727204395.04101: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204395.04626: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 
1727204395.04643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.04665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204395.04677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.04723: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.04739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.04789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.06403: stdout chunk (state=3): >>>/root <<< 37031 1727204395.06505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.06560: stderr chunk (state=3): >>><<< 37031 1727204395.06563: stdout chunk (state=3): >>><<< 37031 1727204395.06585: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.06595: _low_level_execute_command(): starting 37031 1727204395.06600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376 `" && echo ansible-tmp-1727204395.065836-38122-89888342803376="` echo /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376 `" ) && sleep 0' 37031 1727204395.07036: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.07049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.07074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204395.07089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.07135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.07147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.07196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.09038: stdout chunk (state=3): >>>ansible-tmp-1727204395.065836-38122-89888342803376=/root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376 <<< 37031 1727204395.09149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.09202: stderr chunk (state=3): >>><<< 37031 1727204395.09205: stdout chunk (state=3): >>><<< 37031 1727204395.09221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204395.065836-38122-89888342803376=/root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.09258: variable 'ansible_module_compression' from source: unknown 37031 1727204395.09295: ANSIBALLZ: Using lock for ping 37031 1727204395.09298: ANSIBALLZ: Acquiring lock 37031 1727204395.09301: ANSIBALLZ: Lock acquired: 140694167863952 37031 1727204395.09303: ANSIBALLZ: Creating module 37031 1727204395.17396: ANSIBALLZ: Writing module into payload 37031 1727204395.17441: ANSIBALLZ: Writing module 37031 1727204395.17458: ANSIBALLZ: Renaming module 37031 1727204395.17467: ANSIBALLZ: Done creating module 37031 1727204395.17486: variable 'ansible_facts' from source: unknown 37031 1727204395.17527: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/AnsiballZ_ping.py 37031 1727204395.17644: Sending initial data 37031 1727204395.17647: Sent initial data (151 bytes) 37031 1727204395.18343: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.18347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.18386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204395.18389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.18391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.18447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.18451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.18453: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.18499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.20260: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204395.20301: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 37031 1727204395.20304: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204395.20338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpwauhdhx8 /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/AnsiballZ_ping.py <<< 37031 1727204395.20374: stderr chunk (state=3): >>>debug1: 
Couldn't stat remote file: No such file or directory <<< 37031 1727204395.21119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.21218: stderr chunk (state=3): >>><<< 37031 1727204395.21222: stdout chunk (state=3): >>><<< 37031 1727204395.21238: done transferring module to remote 37031 1727204395.21247: _low_level_execute_command(): starting 37031 1727204395.21252: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/ /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/AnsiballZ_ping.py && sleep 0' 37031 1727204395.21688: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.21707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.21722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.21732: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.21782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.21794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 37031 1727204395.21840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.23544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.23625: stderr chunk (state=3): >>><<< 37031 1727204395.23628: stdout chunk (state=3): >>><<< 37031 1727204395.23673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.23682: _low_level_execute_command(): starting 37031 1727204395.23684: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/AnsiballZ_ping.py && sleep 0' 37031 1727204395.24300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204395.24315: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 37031 1727204395.24331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.24350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.24396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204395.24411: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204395.24426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.24447: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204395.24466: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204395.24480: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204395.24494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.24509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.24527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.24540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204395.24554: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204395.24575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.24649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.24676: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.24693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.24775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 37031 1727204395.37763: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 37031 1727204395.38995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204395.38999: stdout chunk (state=3): >>><<< 37031 1727204395.39002: stderr chunk (state=3): >>><<< 37031 1727204395.39127: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
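The `{"ping": "pong"}` reply above comes from running the transferred `AnsiballZ_ping.py` with the remote interpreter, after the `chmod u+x` step on the tmp directory and the payload. A toy re-creation of that chmod-then-execute step, using a stand-in payload rather than the real AnsiballZ wrapper (which bundles the module and its imports into a self-extracting zip):

```shell
workdir="$(mktemp -d)"                       # stand-in for the ansible-tmp-... directory
payload="${workdir}/AnsiballZ_ping.py"       # toy payload, not the real AnsiballZ wrapper
printf '%s\n' 'import json' 'print(json.dumps({"ping": "pong"}))' > "${payload}"
# Same shape as the logged command: chmod u+x on both the directory and the payload,
# then execute the payload with the remote Python and capture stdout.
chmod u+x "${workdir}" "${payload}"
result="$(python3 "${payload}")"
echo "${result}"                             # {"ping": "pong"}
```

The real `ping` module echoes its `data` argument back under the `ping` key ("pong" by default), which is why the `module_args` in the log show `{"data": "pong"}`.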
37031 1727204395.39131: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204395.39134: _low_level_execute_command(): starting 37031 1727204395.39136: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204395.065836-38122-89888342803376/ > /dev/null 2>&1 && sleep 0' 37031 1727204395.40889: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204395.40903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.40916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.40933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.40984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204395.41081: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204395.41095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.41112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204395.41127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 
1727204395.41140: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204395.41150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.41165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.41183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.41195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204395.41204: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204395.41216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.41297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204395.41382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.41396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.41525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.43300: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.43303: stdout chunk (state=3): >>><<< 37031 1727204395.43306: stderr chunk (state=3): >>><<< 37031 1727204395.43672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.43680: handler run complete 37031 1727204395.43682: attempt loop complete, returning result 37031 1727204395.43684: _execute() done 37031 1727204395.43687: dumping result to json 37031 1727204395.43688: done dumping result, returning 37031 1727204395.43691: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-b754-dfb8-00000000002c] 37031 1727204395.43692: sending task result for task 0affcd87-79f5-b754-dfb8-00000000002c 37031 1727204395.43761: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000002c 37031 1727204395.43766: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 37031 1727204395.43828: no more pending results, returning what we have 37031 1727204395.43831: results queue empty 37031 1727204395.43832: checking for any_errors_fatal 37031 1727204395.43837: done checking for any_errors_fatal 37031 1727204395.43837: checking for max_fail_percentage 37031 1727204395.43839: done checking for max_fail_percentage 37031 1727204395.43840: checking to see if all hosts have failed and the running result is not ok 37031 1727204395.43841: done checking to see if all hosts have failed 37031 1727204395.43842: getting the remaining hosts for this loop 37031 1727204395.43843: done getting the remaining hosts for this loop 37031 1727204395.43847: getting 
the next task for host managed-node2 37031 1727204395.43856: done getting next task for host managed-node2 37031 1727204395.43858: ^ task is: TASK: meta (role_complete) 37031 1727204395.43861: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204395.43874: getting variables 37031 1727204395.43876: in VariableManager get_vars() 37031 1727204395.43920: Calling all_inventory to load vars for managed-node2 37031 1727204395.43922: Calling groups_inventory to load vars for managed-node2 37031 1727204395.43925: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.43935: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.43937: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.43940: Calling groups_plugins_play to load vars for managed-node2 37031 1727204395.45786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.47725: done with get_vars() 37031 1727204395.47752: done getting variables 37031 1727204395.47834: done queuing things up, now waiting for results queue to drain 37031 1727204395.47837: results queue empty 37031 1727204395.47838: checking for any_errors_fatal 37031 1727204395.47840: done checking for any_errors_fatal 37031 1727204395.47841: checking for max_fail_percentage 37031 1727204395.47842: done checking for max_fail_percentage 37031 1727204395.47843: checking to see if all hosts 
have failed and the running result is not ok 37031 1727204395.47843: done checking to see if all hosts have failed 37031 1727204395.47844: getting the remaining hosts for this loop 37031 1727204395.47845: done getting the remaining hosts for this loop 37031 1727204395.47848: getting the next task for host managed-node2 37031 1727204395.47856: done getting next task for host managed-node2 37031 1727204395.47859: ^ task is: TASK: Include the task 'assert_device_present.yml' 37031 1727204395.47860: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204395.47862: getting variables 37031 1727204395.47863: in VariableManager get_vars() 37031 1727204395.47879: Calling all_inventory to load vars for managed-node2 37031 1727204395.47882: Calling groups_inventory to load vars for managed-node2 37031 1727204395.47884: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.47889: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.47891: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.47894: Calling groups_plugins_play to load vars for managed-node2 37031 1727204395.49529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.52945: done with get_vars() 37031 1727204395.52976: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Tuesday 24 September 2024 14:59:55 -0400 (0:00:00.503) 0:00:18.075 ***** 37031 1727204395.53050: entering _queue_task() for managed-node2/include_tasks 37031 1727204395.53375: worker is 1 
(out of 1 available) 37031 1727204395.53387: exiting _queue_task() for managed-node2/include_tasks 37031 1727204395.53399: done queuing things up, now waiting for results queue to drain 37031 1727204395.53401: waiting for pending results... 37031 1727204395.53688: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 37031 1727204395.53799: in run() - task 0affcd87-79f5-b754-dfb8-00000000005c 37031 1727204395.53817: variable 'ansible_search_path' from source: unknown 37031 1727204395.53866: calling self._execute() 37031 1727204395.53968: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.53980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.53993: variable 'omit' from source: magic vars 37031 1727204395.54666: variable 'ansible_distribution_major_version' from source: facts 37031 1727204395.54684: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204395.54694: _execute() done 37031 1727204395.54702: dumping result to json 37031 1727204395.54709: done dumping result, returning 37031 1727204395.54719: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [0affcd87-79f5-b754-dfb8-00000000005c] 37031 1727204395.54729: sending task result for task 0affcd87-79f5-b754-dfb8-00000000005c 37031 1727204395.54852: no more pending results, returning what we have 37031 1727204395.54860: in VariableManager get_vars() 37031 1727204395.54913: Calling all_inventory to load vars for managed-node2 37031 1727204395.54916: Calling groups_inventory to load vars for managed-node2 37031 1727204395.54918: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.54930: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.54933: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.54936: Calling groups_plugins_play to load vars for 
managed-node2 37031 1727204395.56282: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000005c 37031 1727204395.56285: WORKER PROCESS EXITING 37031 1727204395.56854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.58561: done with get_vars() 37031 1727204395.58633: variable 'ansible_search_path' from source: unknown 37031 1727204395.58649: we have included files to process 37031 1727204395.58650: generating all_blocks data 37031 1727204395.58652: done generating all_blocks data 37031 1727204395.58660: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 37031 1727204395.58661: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 37031 1727204395.58663: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 37031 1727204395.59292: in VariableManager get_vars() 37031 1727204395.59319: done with get_vars() 37031 1727204395.59437: done processing included file 37031 1727204395.59440: iterating over new_blocks loaded from include file 37031 1727204395.59441: in VariableManager get_vars() 37031 1727204395.59460: done with get_vars() 37031 1727204395.59461: filtering new block on tags 37031 1727204395.59482: done filtering new block on tags 37031 1727204395.59485: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 37031 1727204395.59490: extending task lists for all hosts with included blocks 37031 1727204395.62196: done extending task lists 37031 1727204395.62198: done processing included files 37031 1727204395.62199: results queue empty 37031 
1727204395.62200: checking for any_errors_fatal 37031 1727204395.62201: done checking for any_errors_fatal 37031 1727204395.62202: checking for max_fail_percentage 37031 1727204395.62203: done checking for max_fail_percentage 37031 1727204395.62204: checking to see if all hosts have failed and the running result is not ok 37031 1727204395.62205: done checking to see if all hosts have failed 37031 1727204395.62206: getting the remaining hosts for this loop 37031 1727204395.62207: done getting the remaining hosts for this loop 37031 1727204395.62209: getting the next task for host managed-node2 37031 1727204395.62213: done getting next task for host managed-node2 37031 1727204395.62216: ^ task is: TASK: Include the task 'get_interface_stat.yml' 37031 1727204395.62218: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204395.62221: getting variables 37031 1727204395.62221: in VariableManager get_vars() 37031 1727204395.62236: Calling all_inventory to load vars for managed-node2 37031 1727204395.62238: Calling groups_inventory to load vars for managed-node2 37031 1727204395.62240: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.62246: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.62248: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.62251: Calling groups_plugins_play to load vars for managed-node2 37031 1727204395.63850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.65638: done with get_vars() 37031 1727204395.65668: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:59:55 -0400 (0:00:00.126) 0:00:18.202 ***** 37031 1727204395.65752: entering _queue_task() for managed-node2/include_tasks 37031 1727204395.66089: worker is 1 (out of 1 available) 37031 1727204395.66099: exiting _queue_task() for managed-node2/include_tasks 37031 1727204395.66111: done queuing things up, now waiting for results queue to drain 37031 1727204395.66112: waiting for pending results... 
37031 1727204395.66475: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 37031 1727204395.66598: in run() - task 0affcd87-79f5-b754-dfb8-0000000002b5 37031 1727204395.66616: variable 'ansible_search_path' from source: unknown 37031 1727204395.66622: variable 'ansible_search_path' from source: unknown 37031 1727204395.66676: calling self._execute() 37031 1727204395.66767: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.66783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.66797: variable 'omit' from source: magic vars 37031 1727204395.67361: variable 'ansible_distribution_major_version' from source: facts 37031 1727204395.67386: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204395.67396: _execute() done 37031 1727204395.67403: dumping result to json 37031 1727204395.67412: done dumping result, returning 37031 1727204395.67424: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-b754-dfb8-0000000002b5] 37031 1727204395.67432: sending task result for task 0affcd87-79f5-b754-dfb8-0000000002b5 37031 1727204395.67541: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000002b5 37031 1727204395.67548: WORKER PROCESS EXITING 37031 1727204395.67691: no more pending results, returning what we have 37031 1727204395.67696: in VariableManager get_vars() 37031 1727204395.67742: Calling all_inventory to load vars for managed-node2 37031 1727204395.67745: Calling groups_inventory to load vars for managed-node2 37031 1727204395.67747: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.67761: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.67769: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.67773: Calling groups_plugins_play to load vars for managed-node2 37031 
1727204395.69292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.71062: done with get_vars() 37031 1727204395.71091: variable 'ansible_search_path' from source: unknown 37031 1727204395.71093: variable 'ansible_search_path' from source: unknown 37031 1727204395.71135: we have included files to process 37031 1727204395.71136: generating all_blocks data 37031 1727204395.71138: done generating all_blocks data 37031 1727204395.71140: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 37031 1727204395.71141: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 37031 1727204395.71143: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 37031 1727204395.71426: done processing included file 37031 1727204395.71428: iterating over new_blocks loaded from include file 37031 1727204395.71430: in VariableManager get_vars() 37031 1727204395.71451: done with get_vars() 37031 1727204395.71453: filtering new block on tags 37031 1727204395.71473: done filtering new block on tags 37031 1727204395.71475: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 37031 1727204395.71481: extending task lists for all hosts with included blocks 37031 1727204395.71592: done extending task lists 37031 1727204395.71594: done processing included files 37031 1727204395.71595: results queue empty 37031 1727204395.71595: checking for any_errors_fatal 37031 1727204395.71599: done checking for any_errors_fatal 37031 1727204395.71600: checking for max_fail_percentage 37031 1727204395.71601: done checking for 
max_fail_percentage 37031 1727204395.71602: checking to see if all hosts have failed and the running result is not ok 37031 1727204395.71603: done checking to see if all hosts have failed 37031 1727204395.71603: getting the remaining hosts for this loop 37031 1727204395.71604: done getting the remaining hosts for this loop 37031 1727204395.71607: getting the next task for host managed-node2 37031 1727204395.71611: done getting next task for host managed-node2 37031 1727204395.71613: ^ task is: TASK: Get stat for interface {{ interface }} 37031 1727204395.71616: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204395.71618: getting variables 37031 1727204395.71619: in VariableManager get_vars() 37031 1727204395.71634: Calling all_inventory to load vars for managed-node2 37031 1727204395.71637: Calling groups_inventory to load vars for managed-node2 37031 1727204395.71639: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204395.71644: Calling all_plugins_play to load vars for managed-node2 37031 1727204395.71647: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204395.71649: Calling groups_plugins_play to load vars for managed-node2 37031 1727204395.73185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204395.74772: done with get_vars() 37031 1727204395.74800: done getting variables 37031 1727204395.74999: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:59:55 -0400 (0:00:00.092) 0:00:18.295 ***** 37031 1727204395.75032: entering _queue_task() for managed-node2/stat 37031 1727204395.75379: worker is 1 (out of 1 available) 37031 1727204395.75392: exiting _queue_task() for managed-node2/stat 37031 1727204395.75403: done queuing things up, now waiting for results queue to drain 37031 1727204395.75405: waiting for pending results... 
37031 1727204395.75708: running TaskExecutor() for managed-node2/TASK: Get stat for interface veth0 37031 1727204395.75837: in run() - task 0affcd87-79f5-b754-dfb8-0000000003a0 37031 1727204395.75867: variable 'ansible_search_path' from source: unknown 37031 1727204395.75877: variable 'ansible_search_path' from source: unknown 37031 1727204395.75918: calling self._execute() 37031 1727204395.76022: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.76034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.76050: variable 'omit' from source: magic vars 37031 1727204395.76453: variable 'ansible_distribution_major_version' from source: facts 37031 1727204395.76479: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204395.76493: variable 'omit' from source: magic vars 37031 1727204395.76543: variable 'omit' from source: magic vars 37031 1727204395.76648: variable 'interface' from source: play vars 37031 1727204395.76675: variable 'omit' from source: magic vars 37031 1727204395.76722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204395.76768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204395.76796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204395.76820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204395.76851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204395.76891: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204395.76901: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.76910: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.77019: Set connection var ansible_connection to ssh 37031 1727204395.77031: Set connection var ansible_shell_type to sh 37031 1727204395.77044: Set connection var ansible_pipelining to False 37031 1727204395.77052: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204395.77059: Set connection var ansible_timeout to 10 37031 1727204395.77063: Set connection var ansible_shell_executable to /bin/sh 37031 1727204395.77085: variable 'ansible_shell_executable' from source: unknown 37031 1727204395.77088: variable 'ansible_connection' from source: unknown 37031 1727204395.77091: variable 'ansible_module_compression' from source: unknown 37031 1727204395.77094: variable 'ansible_shell_type' from source: unknown 37031 1727204395.77096: variable 'ansible_shell_executable' from source: unknown 37031 1727204395.77098: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204395.77100: variable 'ansible_pipelining' from source: unknown 37031 1727204395.77103: variable 'ansible_timeout' from source: unknown 37031 1727204395.77105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204395.77254: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204395.77282: variable 'omit' from source: magic vars 37031 1727204395.77287: starting attempt loop 37031 1727204395.77290: running the handler 37031 1727204395.77302: _low_level_execute_command(): starting 37031 1727204395.77308: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204395.77817: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.77835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.77849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.77862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.77914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.77934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.77975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.79621: stdout chunk (state=3): >>>/root <<< 37031 1727204395.79726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.79787: stderr chunk (state=3): >>><<< 37031 1727204395.79791: stdout chunk (state=3): >>><<< 37031 1727204395.79811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.79823: _low_level_execute_command(): starting 37031 1727204395.79828: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087 `" && echo ansible-tmp-1727204395.7981114-38192-243887911645087="` echo /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087 `" ) && sleep 0' 37031 1727204395.80479: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.80498: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.80553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.80571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.80639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.82486: stdout chunk (state=3): >>>ansible-tmp-1727204395.7981114-38192-243887911645087=/root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087 <<< 37031 1727204395.82600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.82653: stderr chunk (state=3): >>><<< 37031 1727204395.82660: stdout chunk (state=3): >>><<< 37031 1727204395.82677: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204395.7981114-38192-243887911645087=/root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.82715: variable 'ansible_module_compression' from source: unknown 37031 1727204395.82760: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 37031 1727204395.82795: variable 'ansible_facts' from source: unknown 37031 1727204395.82853: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/AnsiballZ_stat.py 37031 1727204395.82956: Sending initial data 37031 1727204395.82972: Sent initial data (153 bytes) 37031 1727204395.83648: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.83651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.83688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.83691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204395.83693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.83739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.83751: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.83797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.85501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204395.85533: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204395.85573: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpq9s1yl0f /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/AnsiballZ_stat.py <<< 37031 1727204395.85610: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204395.86375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.86483: stderr chunk (state=3): >>><<< 37031 1727204395.86486: stdout chunk (state=3): >>><<< 37031 1727204395.86503: done transferring module to remote 37031 1727204395.86512: _low_level_execute_command(): starting 37031 1727204395.86517: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/ /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/AnsiballZ_stat.py && sleep 0' 37031 1727204395.87082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 37031 1727204395.87094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.87176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.87207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204395.88914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204395.88967: stderr chunk (state=3): >>><<< 37031 1727204395.88970: stdout chunk (state=3): >>><<< 37031 1727204395.88985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204395.88992: _low_level_execute_command(): starting 37031 1727204395.88994: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/AnsiballZ_stat.py && sleep 0' 37031 1727204395.89446: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204395.89450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204395.89487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.89491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204395.89493: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204395.89544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204395.89547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204395.89598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.02840: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31587, "dev": 21, "nlink": 1, "atime": 1727204384.821291, "mtime": 1727204384.821291, "ctime": 1727204384.821291, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 37031 1727204396.03862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204396.03926: stderr chunk (state=3): >>><<< 37031 1727204396.03930: stdout chunk (state=3): >>><<< 37031 1727204396.03946: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 31587, "dev": 21, "nlink": 1, "atime": 1727204384.821291, "mtime": 1727204384.821291, "ctime": 1727204384.821291, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204396.03993: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204396.04001: _low_level_execute_command(): starting 37031 1727204396.04005: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204395.7981114-38192-243887911645087/ > /dev/null 2>&1 && sleep 0' 37031 1727204396.04478: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.04482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.04516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.04530: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.04541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.04587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204396.04600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.04653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.06473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.06523: stderr chunk (state=3): >>><<< 37031 1727204396.06526: stdout chunk (state=3): >>><<< 37031 1727204396.06541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204396.06552: handler run complete 37031 1727204396.06592: attempt loop complete, returning result 37031 1727204396.06595: _execute() done 37031 1727204396.06598: dumping result to json 37031 1727204396.06603: done dumping result, returning 37031 1727204396.06610: done running TaskExecutor() for managed-node2/TASK: Get stat for interface veth0 [0affcd87-79f5-b754-dfb8-0000000003a0] 37031 1727204396.06614: sending task result for task 0affcd87-79f5-b754-dfb8-0000000003a0 37031 1727204396.06718: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000003a0 37031 1727204396.06720: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204384.821291, "block_size": 4096, "blocks": 0, "ctime": 1727204384.821291, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 31587, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1727204384.821291, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 37031 1727204396.06807: no more pending results, returning what we have 37031 1727204396.06811: results queue empty 37031 1727204396.06812: checking for any_errors_fatal 37031 1727204396.06814: done checking for any_errors_fatal 37031 
1727204396.06814: checking for max_fail_percentage 37031 1727204396.06816: done checking for max_fail_percentage 37031 1727204396.06817: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.06818: done checking to see if all hosts have failed 37031 1727204396.06819: getting the remaining hosts for this loop 37031 1727204396.06820: done getting the remaining hosts for this loop 37031 1727204396.06824: getting the next task for host managed-node2 37031 1727204396.06832: done getting next task for host managed-node2 37031 1727204396.06835: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 37031 1727204396.06838: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204396.06842: getting variables 37031 1727204396.06843: in VariableManager get_vars() 37031 1727204396.06888: Calling all_inventory to load vars for managed-node2 37031 1727204396.06891: Calling groups_inventory to load vars for managed-node2 37031 1727204396.06893: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.06903: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.06905: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.06907: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.07705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.08603: done with get_vars() 37031 1727204396.08618: done getting variables 37031 1727204396.08696: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 37031 1727204396.08784: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:59:56 -0400 (0:00:00.337) 0:00:18.632 ***** 37031 1727204396.08807: entering _queue_task() for managed-node2/assert 37031 1727204396.08808: Creating lock for assert 37031 1727204396.09038: worker is 1 (out of 1 available) 37031 1727204396.09052: exiting _queue_task() for managed-node2/assert 37031 1727204396.09069: done queuing things up, now waiting for results queue to drain 37031 1727204396.09070: waiting for pending results... 
37031 1727204396.09242: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'veth0' 37031 1727204396.09306: in run() - task 0affcd87-79f5-b754-dfb8-0000000002b6 37031 1727204396.09316: variable 'ansible_search_path' from source: unknown 37031 1727204396.09320: variable 'ansible_search_path' from source: unknown 37031 1727204396.09348: calling self._execute() 37031 1727204396.09419: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.09424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.09433: variable 'omit' from source: magic vars 37031 1727204396.09701: variable 'ansible_distribution_major_version' from source: facts 37031 1727204396.09711: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204396.09716: variable 'omit' from source: magic vars 37031 1727204396.09746: variable 'omit' from source: magic vars 37031 1727204396.09816: variable 'interface' from source: play vars 37031 1727204396.09829: variable 'omit' from source: magic vars 37031 1727204396.09863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204396.09890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204396.09910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204396.09923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.09932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.09957: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204396.09961: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.09965: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.10028: Set connection var ansible_connection to ssh 37031 1727204396.10032: Set connection var ansible_shell_type to sh 37031 1727204396.10038: Set connection var ansible_pipelining to False 37031 1727204396.10045: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204396.10050: Set connection var ansible_timeout to 10 37031 1727204396.10059: Set connection var ansible_shell_executable to /bin/sh 37031 1727204396.10079: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.10082: variable 'ansible_connection' from source: unknown 37031 1727204396.10085: variable 'ansible_module_compression' from source: unknown 37031 1727204396.10087: variable 'ansible_shell_type' from source: unknown 37031 1727204396.10089: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.10091: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.10095: variable 'ansible_pipelining' from source: unknown 37031 1727204396.10098: variable 'ansible_timeout' from source: unknown 37031 1727204396.10102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.10202: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204396.10211: variable 'omit' from source: magic vars 37031 1727204396.10217: starting attempt loop 37031 1727204396.10220: running the handler 37031 1727204396.10311: variable 'interface_stat' from source: set_fact 37031 1727204396.10326: Evaluated conditional (interface_stat.stat.exists): True 37031 1727204396.10331: handler run complete 37031 1727204396.10344: attempt loop complete, returning result 37031 
1727204396.10347: _execute() done 37031 1727204396.10350: dumping result to json 37031 1727204396.10352: done dumping result, returning 37031 1727204396.10360: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'veth0' [0affcd87-79f5-b754-dfb8-0000000002b6] 37031 1727204396.10363: sending task result for task 0affcd87-79f5-b754-dfb8-0000000002b6 37031 1727204396.10441: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000002b6 37031 1727204396.10449: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204396.10526: no more pending results, returning what we have 37031 1727204396.10530: results queue empty 37031 1727204396.10531: checking for any_errors_fatal 37031 1727204396.10536: done checking for any_errors_fatal 37031 1727204396.10537: checking for max_fail_percentage 37031 1727204396.10539: done checking for max_fail_percentage 37031 1727204396.10540: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.10541: done checking to see if all hosts have failed 37031 1727204396.10541: getting the remaining hosts for this loop 37031 1727204396.10543: done getting the remaining hosts for this loop 37031 1727204396.10547: getting the next task for host managed-node2 37031 1727204396.10565: done getting next task for host managed-node2 37031 1727204396.10568: ^ task is: TASK: Include the task 'assert_profile_present.yml' 37031 1727204396.10570: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204396.10574: getting variables 37031 1727204396.10576: in VariableManager get_vars() 37031 1727204396.10610: Calling all_inventory to load vars for managed-node2 37031 1727204396.10612: Calling groups_inventory to load vars for managed-node2 37031 1727204396.10614: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.10623: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.10626: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.10628: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.14502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.15390: done with get_vars() 37031 1727204396.15407: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Tuesday 24 September 2024 14:59:56 -0400 (0:00:00.066) 0:00:18.699 ***** 37031 1727204396.15467: entering _queue_task() for managed-node2/include_tasks 37031 1727204396.15701: worker is 1 (out of 1 available) 37031 1727204396.15714: exiting _queue_task() for managed-node2/include_tasks 37031 1727204396.15727: done queuing things up, now waiting for results queue to drain 37031 1727204396.15728: waiting for pending results... 
37031 1727204396.15910: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' 37031 1727204396.15977: in run() - task 0affcd87-79f5-b754-dfb8-00000000005d 37031 1727204396.15987: variable 'ansible_search_path' from source: unknown 37031 1727204396.16017: calling self._execute() 37031 1727204396.16089: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.16093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.16101: variable 'omit' from source: magic vars 37031 1727204396.16374: variable 'ansible_distribution_major_version' from source: facts 37031 1727204396.16385: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204396.16389: _execute() done 37031 1727204396.16392: dumping result to json 37031 1727204396.16396: done dumping result, returning 37031 1727204396.16404: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_present.yml' [0affcd87-79f5-b754-dfb8-00000000005d] 37031 1727204396.16408: sending task result for task 0affcd87-79f5-b754-dfb8-00000000005d 37031 1727204396.16503: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000005d 37031 1727204396.16507: WORKER PROCESS EXITING 37031 1727204396.16539: no more pending results, returning what we have 37031 1727204396.16543: in VariableManager get_vars() 37031 1727204396.16596: Calling all_inventory to load vars for managed-node2 37031 1727204396.16598: Calling groups_inventory to load vars for managed-node2 37031 1727204396.16600: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.16612: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.16615: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.16623: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.17406: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.18324: done with get_vars() 37031 1727204396.18337: variable 'ansible_search_path' from source: unknown 37031 1727204396.18350: we have included files to process 37031 1727204396.18351: generating all_blocks data 37031 1727204396.18352: done generating all_blocks data 37031 1727204396.18357: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 37031 1727204396.18358: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 37031 1727204396.18360: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 37031 1727204396.18498: in VariableManager get_vars() 37031 1727204396.18514: done with get_vars() 37031 1727204396.18692: done processing included file 37031 1727204396.18694: iterating over new_blocks loaded from include file 37031 1727204396.18695: in VariableManager get_vars() 37031 1727204396.18706: done with get_vars() 37031 1727204396.18708: filtering new block on tags 37031 1727204396.18721: done filtering new block on tags 37031 1727204396.18723: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node2 37031 1727204396.18726: extending task lists for all hosts with included blocks 37031 1727204396.20229: done extending task lists 37031 1727204396.20230: done processing included files 37031 1727204396.20230: results queue empty 37031 1727204396.20231: checking for any_errors_fatal 37031 1727204396.20234: done checking for any_errors_fatal 37031 1727204396.20234: checking for max_fail_percentage 37031 1727204396.20235: done 
checking for max_fail_percentage 37031 1727204396.20235: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.20236: done checking to see if all hosts have failed 37031 1727204396.20237: getting the remaining hosts for this loop 37031 1727204396.20238: done getting the remaining hosts for this loop 37031 1727204396.20240: getting the next task for host managed-node2 37031 1727204396.20243: done getting next task for host managed-node2 37031 1727204396.20245: ^ task is: TASK: Include the task 'get_profile_stat.yml' 37031 1727204396.20246: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204396.20248: getting variables 37031 1727204396.20249: in VariableManager get_vars() 37031 1727204396.20259: Calling all_inventory to load vars for managed-node2 37031 1727204396.20261: Calling groups_inventory to load vars for managed-node2 37031 1727204396.20262: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.20268: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.20269: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.20271: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.21183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.22850: done with get_vars() 37031 1727204396.22874: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:59:56 -0400 (0:00:00.074) 0:00:18.774 ***** 37031 1727204396.22954: entering _queue_task() for managed-node2/include_tasks 37031 1727204396.23279: worker is 1 (out of 1 available) 37031 1727204396.23291: exiting _queue_task() for managed-node2/include_tasks 37031 1727204396.23304: done queuing things up, now waiting for results queue to drain 37031 1727204396.23305: waiting for pending results... 
37031 1727204396.23588: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 37031 1727204396.23691: in run() - task 0affcd87-79f5-b754-dfb8-0000000003b8 37031 1727204396.23703: variable 'ansible_search_path' from source: unknown 37031 1727204396.23707: variable 'ansible_search_path' from source: unknown 37031 1727204396.23745: calling self._execute() 37031 1727204396.23842: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.23846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.23860: variable 'omit' from source: magic vars 37031 1727204396.24236: variable 'ansible_distribution_major_version' from source: facts 37031 1727204396.24249: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204396.24254: _execute() done 37031 1727204396.24260: dumping result to json 37031 1727204396.24263: done dumping result, returning 37031 1727204396.24270: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-b754-dfb8-0000000003b8] 37031 1727204396.24272: sending task result for task 0affcd87-79f5-b754-dfb8-0000000003b8 37031 1727204396.24367: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000003b8 37031 1727204396.24371: WORKER PROCESS EXITING 37031 1727204396.24398: no more pending results, returning what we have 37031 1727204396.24404: in VariableManager get_vars() 37031 1727204396.24456: Calling all_inventory to load vars for managed-node2 37031 1727204396.24459: Calling groups_inventory to load vars for managed-node2 37031 1727204396.24462: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.24477: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.24480: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.24484: Calling groups_plugins_play to load vars for managed-node2 37031 
1727204396.26369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.30577: done with get_vars() 37031 1727204396.30609: variable 'ansible_search_path' from source: unknown 37031 1727204396.30611: variable 'ansible_search_path' from source: unknown 37031 1727204396.30651: we have included files to process 37031 1727204396.30653: generating all_blocks data 37031 1727204396.30771: done generating all_blocks data 37031 1727204396.30773: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 37031 1727204396.30774: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 37031 1727204396.30778: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 37031 1727204396.33187: done processing included file 37031 1727204396.33190: iterating over new_blocks loaded from include file 37031 1727204396.33191: in VariableManager get_vars() 37031 1727204396.33214: done with get_vars() 37031 1727204396.33217: filtering new block on tags 37031 1727204396.33243: done filtering new block on tags 37031 1727204396.33246: in VariableManager get_vars() 37031 1727204396.33273: done with get_vars() 37031 1727204396.33389: filtering new block on tags 37031 1727204396.33414: done filtering new block on tags 37031 1727204396.33416: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 37031 1727204396.33422: extending task lists for all hosts with included blocks 37031 1727204396.33849: done extending task lists 37031 1727204396.33851: done processing included files 37031 1727204396.33851: results queue empty 37031 
1727204396.33852: checking for any_errors_fatal 37031 1727204396.33858: done checking for any_errors_fatal 37031 1727204396.33859: checking for max_fail_percentage 37031 1727204396.33860: done checking for max_fail_percentage 37031 1727204396.33861: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.33862: done checking to see if all hosts have failed 37031 1727204396.33862: getting the remaining hosts for this loop 37031 1727204396.33866: done getting the remaining hosts for this loop 37031 1727204396.33868: getting the next task for host managed-node2 37031 1727204396.33873: done getting next task for host managed-node2 37031 1727204396.33876: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 37031 1727204396.33879: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204396.33882: getting variables 37031 1727204396.33883: in VariableManager get_vars() 37031 1727204396.34092: Calling all_inventory to load vars for managed-node2 37031 1727204396.34094: Calling groups_inventory to load vars for managed-node2 37031 1727204396.34096: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.34102: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.34105: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.34108: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.36893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204396.40576: done with get_vars() 37031 1727204396.40614: done getting variables 37031 1727204396.40669: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:59:56 -0400 (0:00:00.177) 0:00:18.951 ***** 37031 1727204396.40705: entering _queue_task() for managed-node2/set_fact 37031 1727204396.41071: worker is 1 (out of 1 available) 37031 1727204396.41083: exiting _queue_task() for managed-node2/set_fact 37031 1727204396.41098: done queuing things up, now waiting for results queue to drain 37031 1727204396.41099: waiting for pending results... 
37031 1727204396.42134: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 37031 1727204396.42380: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b0 37031 1727204396.42461: variable 'ansible_search_path' from source: unknown 37031 1727204396.42473: variable 'ansible_search_path' from source: unknown 37031 1727204396.42515: calling self._execute() 37031 1727204396.42642: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.42776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.42792: variable 'omit' from source: magic vars 37031 1727204396.43607: variable 'ansible_distribution_major_version' from source: facts 37031 1727204396.43625: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204396.43747: variable 'omit' from source: magic vars 37031 1727204396.43805: variable 'omit' from source: magic vars 37031 1727204396.43845: variable 'omit' from source: magic vars 37031 1727204396.43896: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204396.43999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204396.44093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204396.44115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.44129: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.44210: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204396.44218: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.44224: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 37031 1727204396.44454: Set connection var ansible_connection to ssh 37031 1727204396.44466: Set connection var ansible_shell_type to sh 37031 1727204396.44478: Set connection var ansible_pipelining to False 37031 1727204396.44489: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204396.44499: Set connection var ansible_timeout to 10 37031 1727204396.44510: Set connection var ansible_shell_executable to /bin/sh 37031 1727204396.44642: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.44651: variable 'ansible_connection' from source: unknown 37031 1727204396.44663: variable 'ansible_module_compression' from source: unknown 37031 1727204396.44672: variable 'ansible_shell_type' from source: unknown 37031 1727204396.44679: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.44686: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.44697: variable 'ansible_pipelining' from source: unknown 37031 1727204396.44705: variable 'ansible_timeout' from source: unknown 37031 1727204396.44714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.44984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204396.45063: variable 'omit' from source: magic vars 37031 1727204396.45076: starting attempt loop 37031 1727204396.45083: running the handler 37031 1727204396.45100: handler run complete 37031 1727204396.45172: attempt loop complete, returning result 37031 1727204396.45179: _execute() done 37031 1727204396.45187: dumping result to json 37031 1727204396.45195: done dumping result, returning 37031 1727204396.45205: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-b754-dfb8-0000000004b0] 37031 1727204396.45213: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b0 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 37031 1727204396.45372: no more pending results, returning what we have 37031 1727204396.45376: results queue empty 37031 1727204396.45377: checking for any_errors_fatal 37031 1727204396.45379: done checking for any_errors_fatal 37031 1727204396.45379: checking for max_fail_percentage 37031 1727204396.45381: done checking for max_fail_percentage 37031 1727204396.45382: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.45383: done checking to see if all hosts have failed 37031 1727204396.45384: getting the remaining hosts for this loop 37031 1727204396.45386: done getting the remaining hosts for this loop 37031 1727204396.45390: getting the next task for host managed-node2 37031 1727204396.45397: done getting next task for host managed-node2 37031 1727204396.45400: ^ task is: TASK: Stat profile file 37031 1727204396.45406: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204396.45409: getting variables
37031 1727204396.45412: in VariableManager get_vars()
37031 1727204396.45460: Calling all_inventory to load vars for managed-node2
37031 1727204396.45462: Calling groups_inventory to load vars for managed-node2
37031 1727204396.45466: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204396.45477: Calling all_plugins_play to load vars for managed-node2
37031 1727204396.45479: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204396.45483: Calling groups_plugins_play to load vars for managed-node2
37031 1727204396.46673: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b0
37031 1727204396.46677: WORKER PROCESS EXITING
37031 1727204396.47870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204396.50966: done with get_vars()
37031 1727204396.50999: done getting variables

TASK [Stat profile file] *******************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
Tuesday 24 September 2024  14:59:56 -0400 (0:00:00.111)       0:00:19.062 *****
37031 1727204396.51808: entering _queue_task() for managed-node2/stat
37031 1727204396.52167: worker is 1 (out of 1 available)
37031 1727204396.52183: exiting _queue_task() for managed-node2/stat
37031 1727204396.52196: done queuing things up, now waiting for results queue to drain
37031 1727204396.52197: waiting for pending results...
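As the entries that follow show, the task executor opens every remote task with a trivial probe. A minimal sketch of that probe, assuming a POSIX `/bin/sh` (the command string is taken verbatim from the log's `_low_level_execute_command()` entries; the `&& sleep 0` suffix appears on every remote command in this log):

```shell
# Ansible's first low-level command per task is a home-directory probe:
# it runs `echo ~ && sleep 0` so the controller can expand remote_tmp
# (~/.ansible/tmp) before creating the per-task temporary directory.
home="$(/bin/sh -c 'echo ~ && sleep 0')"
printf '%s\n' "$home"
```

On the host traced below this prints `/root`, which is why the per-task tmpdir lands under `/root/.ansible/tmp`.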
37031 1727204396.52948: running TaskExecutor() for managed-node2/TASK: Stat profile file 37031 1727204396.53321: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b1 37031 1727204396.53342: variable 'ansible_search_path' from source: unknown 37031 1727204396.53716: variable 'ansible_search_path' from source: unknown 37031 1727204396.53762: calling self._execute() 37031 1727204396.53868: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.53881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.53895: variable 'omit' from source: magic vars 37031 1727204396.54792: variable 'ansible_distribution_major_version' from source: facts 37031 1727204396.54811: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204396.54822: variable 'omit' from source: magic vars 37031 1727204396.54882: variable 'omit' from source: magic vars 37031 1727204396.54988: variable 'profile' from source: include params 37031 1727204396.54998: variable 'interface' from source: play vars 37031 1727204396.55071: variable 'interface' from source: play vars 37031 1727204396.55096: variable 'omit' from source: magic vars 37031 1727204396.55144: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204396.55191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204396.55763: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204396.55791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.55807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204396.55843: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 
1727204396.55851: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.55861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.55961: Set connection var ansible_connection to ssh 37031 1727204396.55972: Set connection var ansible_shell_type to sh 37031 1727204396.55982: Set connection var ansible_pipelining to False 37031 1727204396.55993: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204396.56002: Set connection var ansible_timeout to 10 37031 1727204396.56010: Set connection var ansible_shell_executable to /bin/sh 37031 1727204396.56044: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.56051: variable 'ansible_connection' from source: unknown 37031 1727204396.56061: variable 'ansible_module_compression' from source: unknown 37031 1727204396.56072: variable 'ansible_shell_type' from source: unknown 37031 1727204396.56079: variable 'ansible_shell_executable' from source: unknown 37031 1727204396.56087: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204396.56095: variable 'ansible_pipelining' from source: unknown 37031 1727204396.56101: variable 'ansible_timeout' from source: unknown 37031 1727204396.56108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204396.56317: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204396.56335: variable 'omit' from source: magic vars 37031 1727204396.56761: starting attempt loop 37031 1727204396.56772: running the handler 37031 1727204396.56793: _low_level_execute_command(): starting 37031 1727204396.56805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204396.57581: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.57585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.57622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204396.57626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.57629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204396.57631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.57715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.57788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.58002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.59523: stdout chunk (state=3): >>>/root <<< 37031 1727204396.59624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.59713: stderr chunk (state=3): >>><<< 37031 1727204396.59716: stdout chunk (state=3): >>><<< 37031 1727204396.59829: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204396.59833: _low_level_execute_command(): starting 37031 1727204396.59836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907 `" && echo ansible-tmp-1727204396.5973685-38230-96329570462907="` echo /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907 `" ) && sleep 0' 37031 1727204396.61278: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.61282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.61315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.61329: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.61332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.61507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.61568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.61778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.63577: stdout chunk (state=3): >>>ansible-tmp-1727204396.5973685-38230-96329570462907=/root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907 <<< 37031 1727204396.63693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.63778: stderr chunk (state=3): >>><<< 37031 1727204396.63781: stdout chunk (state=3): >>><<< 37031 1727204396.64071: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204396.5973685-38230-96329570462907=/root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204396.64075: variable 'ansible_module_compression' from source: unknown 37031 1727204396.64077: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 37031 1727204396.64080: variable 'ansible_facts' from source: unknown 37031 1727204396.64082: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/AnsiballZ_stat.py 37031 1727204396.64685: Sending initial data 37031 1727204396.64691: Sent initial data (152 bytes) 37031 1727204396.67142: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204396.67162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.67198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.67219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.67352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.67371: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204396.67387: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.67406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204396.67418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204396.67431: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204396.67448: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.67469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.67486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.67499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.67511: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204396.67526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.67648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204396.67721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.67742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.67818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.69526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 
debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204396.69566: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204396.69604: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpu7mzeski /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/AnsiballZ_stat.py <<< 37031 1727204396.69643: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204396.71067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.71143: stderr chunk (state=3): >>><<< 37031 1727204396.71147: stdout chunk (state=3): >>><<< 37031 1727204396.71168: done transferring module to remote 37031 1727204396.71180: _low_level_execute_command(): starting 37031 1727204396.71184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/ /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/AnsiballZ_stat.py && sleep 0' 37031 1727204396.72619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.72623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.73408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204396.73412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.73428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.73436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.73509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204396.73523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.73528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.73597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.75377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.75382: stderr chunk (state=3): >>><<< 37031 1727204396.75387: stdout chunk (state=3): >>><<< 37031 1727204396.75408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204396.75412: _low_level_execute_command(): starting 37031 1727204396.75415: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/AnsiballZ_stat.py && sleep 0' 37031 1727204396.76289: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204396.76305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.76319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.76336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.76392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.76406: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204396.76421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.76441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204396.76453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204396.76472: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204396.76485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.76502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.76518: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.76529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.76539: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204396.76552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.76637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204396.76691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.76709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.76796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.89983: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 37031 1727204396.91028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204396.91032: stdout chunk (state=3): >>><<< 37031 1727204396.91035: stderr chunk (state=3): >>><<< 37031 1727204396.91174: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
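The module's entire result travels back as the single JSON object on stdout shown above. A hedged, quick way to pull one field out of such a result, assuming `python3` is available where you inspect the log (the JSON literal below is the log's own result reduced to the fields of interest):

```shell
# The stat result captured in the log, reduced to its essential fields.
result='{"changed": false, "stat": {"exists": false}}'
# Extract stat.exists from the JSON blob; prints "False" here because
# /etc/sysconfig/network-scripts/ifcfg-veth0 was absent on managed-node2.
exists="$(printf '%s' "$result" | python3 -c 'import json,sys; print(json.load(sys.stdin)["stat"]["exists"])')"
printf '%s\n' "$exists"
```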
37031 1727204396.91179: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204396.91183: _low_level_execute_command(): starting 37031 1727204396.91185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204396.5973685-38230-96329570462907/ > /dev/null 2>&1 && sleep 0' 37031 1727204396.93576: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204396.94787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.94802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.94817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.94873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.94884: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204396.94898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.94913: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 
1727204396.94922: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204396.94931: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204396.94941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204396.94951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204396.94973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204396.94985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204396.94994: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204396.95005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204396.95080: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204396.95101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204396.95114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204396.95184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204396.97080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204396.97083: stdout chunk (state=3): >>><<< 37031 1727204396.97086: stderr chunk (state=3): >>><<< 37031 1727204396.97169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204396.97173: handler run complete 37031 1727204396.97175: attempt loop complete, returning result 37031 1727204396.97177: _execute() done 37031 1727204396.97180: dumping result to json 37031 1727204396.97182: done dumping result, returning 37031 1727204396.97184: done running TaskExecutor() for managed-node2/TASK: Stat profile file [0affcd87-79f5-b754-dfb8-0000000004b1] 37031 1727204396.97186: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b1 37031 1727204396.97454: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b1 37031 1727204396.97461: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 37031 1727204396.97528: no more pending results, returning what we have 37031 1727204396.97531: results queue empty 37031 1727204396.97532: checking for any_errors_fatal 37031 1727204396.97537: done checking for any_errors_fatal 37031 1727204396.97538: checking for max_fail_percentage 37031 1727204396.97539: done checking for max_fail_percentage 37031 1727204396.97540: checking to see if all hosts have failed and the running result is not ok 37031 1727204396.97541: done checking to see if all hosts have failed 37031 1727204396.97542: getting the remaining hosts for this loop 37031 
1727204396.97543: done getting the remaining hosts for this loop 37031 1727204396.97546: getting the next task for host managed-node2 37031 1727204396.97552: done getting next task for host managed-node2 37031 1727204396.97554: ^ task is: TASK: Set NM profile exist flag based on the profile files 37031 1727204396.97558: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204396.97562: getting variables 37031 1727204396.97563: in VariableManager get_vars() 37031 1727204396.97604: Calling all_inventory to load vars for managed-node2 37031 1727204396.97606: Calling groups_inventory to load vars for managed-node2 37031 1727204396.97608: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204396.97618: Calling all_plugins_play to load vars for managed-node2 37031 1727204396.97620: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204396.97623: Calling groups_plugins_play to load vars for managed-node2 37031 1727204396.99705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204397.05326: done with get_vars() 37031 1727204397.05352: done getting variables 37031 1727204397.05416: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:59:57 -0400 (0:00:00.536) 0:00:19.599 ***** 37031 1727204397.05447: entering _queue_task() for managed-node2/set_fact 37031 1727204397.06644: worker is 1 (out of 1 available) 37031 1727204397.06656: exiting _queue_task() for managed-node2/set_fact 37031 1727204397.06672: done queuing things up, now waiting for results queue to drain 37031 1727204397.06673: waiting for pending results... 
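[Editor's note] Every record in this log is prefixed by the worker PID (`37031`) and a Unix epoch timestamp such as `1727204397.05447`. Converting one of those timestamps reproduces the human-readable header the log prints ("Tuesday 24 September 2024 14:59:57 -0400"); a quick sketch:

```python
from datetime import datetime, timezone, timedelta

# Epoch timestamp taken from a log record above.
ts = 1727204397.05447

# The play header reports local time at UTC-4 (EDT).
edt = timezone(timedelta(hours=-4))
local = datetime.fromtimestamp(ts, tz=edt)

print(local.strftime("%A %d %B %Y %H:%M:%S"))
# Tuesday 24 September 2024 14:59:57
```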
37031 1727204397.07370: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 37031 1727204397.07500: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b2 37031 1727204397.07518: variable 'ansible_search_path' from source: unknown 37031 1727204397.07525: variable 'ansible_search_path' from source: unknown 37031 1727204397.07572: calling self._execute() 37031 1727204397.07802: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.07931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.07946: variable 'omit' from source: magic vars 37031 1727204397.08737: variable 'ansible_distribution_major_version' from source: facts 37031 1727204397.08759: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204397.09040: variable 'profile_stat' from source: set_fact 37031 1727204397.09131: Evaluated conditional (profile_stat.stat.exists): False 37031 1727204397.09139: when evaluation is False, skipping this task 37031 1727204397.09147: _execute() done 37031 1727204397.09155: dumping result to json 37031 1727204397.09168: done dumping result, returning 37031 1727204397.09179: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-b754-dfb8-0000000004b2] 37031 1727204397.09188: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b2 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 37031 1727204397.09349: no more pending results, returning what we have 37031 1727204397.09354: results queue empty 37031 1727204397.09355: checking for any_errors_fatal 37031 1727204397.09366: done checking for any_errors_fatal 37031 1727204397.09367: checking for max_fail_percentage 37031 1727204397.09369: done checking for max_fail_percentage 37031 1727204397.09371: checking to see if all 
hosts have failed and the running result is not ok 37031 1727204397.09372: done checking to see if all hosts have failed 37031 1727204397.09372: getting the remaining hosts for this loop 37031 1727204397.09374: done getting the remaining hosts for this loop 37031 1727204397.09379: getting the next task for host managed-node2 37031 1727204397.09387: done getting next task for host managed-node2 37031 1727204397.09390: ^ task is: TASK: Get NM profile info 37031 1727204397.09395: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204397.09400: getting variables 37031 1727204397.09402: in VariableManager get_vars() 37031 1727204397.09451: Calling all_inventory to load vars for managed-node2 37031 1727204397.09455: Calling groups_inventory to load vars for managed-node2 37031 1727204397.09457: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204397.09473: Calling all_plugins_play to load vars for managed-node2 37031 1727204397.09475: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204397.09479: Calling groups_plugins_play to load vars for managed-node2 37031 1727204397.10672: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b2 37031 1727204397.10677: WORKER PROCESS EXITING 37031 1727204397.12256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204397.14640: done with get_vars() 37031 1727204397.15572: done getting variables 37031 1727204397.15639: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:59:57 -0400 (0:00:00.102) 0:00:19.701 ***** 37031 1727204397.15677: entering _queue_task() for managed-node2/shell 37031 1727204397.16024: worker is 1 (out of 1 available) 37031 1727204397.16040: exiting _queue_task() for managed-node2/shell 37031 1727204397.16053: done queuing things up, now waiting for results queue to drain 37031 1727204397.16054: waiting for pending results... 
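[Editor's note] The "Get NM profile info" task queued above runs the shell pipeline `nmcli -f NAME,FILENAME connection show | grep veth0 | grep /etc` (its full invocation and result appear further down in this log). The filtering the two `grep`s perform can be mimicked in Python against sample output; the `veth0` line below is modeled on the result recorded for managed-node2, while the `lo` line is invented for contrast:

```python
# Sketch of the `| grep veth0 | grep /etc` filtering applied to
# `nmcli -f NAME,FILENAME connection show`. Sample text, not live output.
sample = """\
NAME    FILENAME
veth0   /etc/NetworkManager/system-connections/veth0.nmconnection
lo      /run/NetworkManager/system-connections/lo.nmconnection
"""

# Keep only lines matching both substrings, as the chained greps do.
matches = [line for line in sample.splitlines()
           if "veth0" in line and "/etc" in line]
```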
37031 1727204397.17646: running TaskExecutor() for managed-node2/TASK: Get NM profile info 37031 1727204397.18092: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b3 37031 1727204397.18113: variable 'ansible_search_path' from source: unknown 37031 1727204397.18121: variable 'ansible_search_path' from source: unknown 37031 1727204397.18169: calling self._execute() 37031 1727204397.18267: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.18280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.18295: variable 'omit' from source: magic vars 37031 1727204397.19074: variable 'ansible_distribution_major_version' from source: facts 37031 1727204397.19092: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204397.19102: variable 'omit' from source: magic vars 37031 1727204397.19153: variable 'omit' from source: magic vars 37031 1727204397.19575: variable 'profile' from source: include params 37031 1727204397.19586: variable 'interface' from source: play vars 37031 1727204397.19660: variable 'interface' from source: play vars 37031 1727204397.19686: variable 'omit' from source: magic vars 37031 1727204397.19736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204397.19783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204397.19890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204397.19912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204397.19927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204397.19965: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 
1727204397.20447: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.20459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.20565: Set connection var ansible_connection to ssh 37031 1727204397.20575: Set connection var ansible_shell_type to sh 37031 1727204397.20589: Set connection var ansible_pipelining to False 37031 1727204397.20603: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204397.20613: Set connection var ansible_timeout to 10 37031 1727204397.20623: Set connection var ansible_shell_executable to /bin/sh 37031 1727204397.20658: variable 'ansible_shell_executable' from source: unknown 37031 1727204397.20670: variable 'ansible_connection' from source: unknown 37031 1727204397.20679: variable 'ansible_module_compression' from source: unknown 37031 1727204397.20686: variable 'ansible_shell_type' from source: unknown 37031 1727204397.20693: variable 'ansible_shell_executable' from source: unknown 37031 1727204397.20701: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.20709: variable 'ansible_pipelining' from source: unknown 37031 1727204397.20716: variable 'ansible_timeout' from source: unknown 37031 1727204397.20725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.20882: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204397.20900: variable 'omit' from source: magic vars 37031 1727204397.20910: starting attempt loop 37031 1727204397.20918: running the handler 37031 1727204397.20933: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204397.20959: _low_level_execute_command(): starting 37031 1727204397.20975: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204397.22746: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.22750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.22775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.22780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.22899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204397.22902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.22982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.22990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.23115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204397.23350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 
1727204397.24902: stdout chunk (state=3): >>>/root <<< 37031 1727204397.25091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204397.25094: stdout chunk (state=3): >>><<< 37031 1727204397.25106: stderr chunk (state=3): >>><<< 37031 1727204397.25133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204397.25149: _low_level_execute_command(): starting 37031 1727204397.25155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737 `" && echo ansible-tmp-1727204397.2513423-38272-104940584939737="` echo /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737 `" ) && sleep 0' 37031 1727204397.27281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 37031 1727204397.27308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.27313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.27338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.27397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.27443: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204397.27457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.27481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204397.27488: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204397.27498: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204397.27517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.27559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.27575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.27585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.27597: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204397.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.27894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.27925: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.28006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 37031 1727204397.28218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204397.30077: stdout chunk (state=3): >>>ansible-tmp-1727204397.2513423-38272-104940584939737=/root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737 <<< 37031 1727204397.30262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204397.30268: stdout chunk (state=3): >>><<< 37031 1727204397.30280: stderr chunk (state=3): >>><<< 37031 1727204397.30302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204397.2513423-38272-104940584939737=/root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204397.30337: variable 'ansible_module_compression' from source: unknown 37031 1727204397.30396: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204397.30435: variable 'ansible_facts' from source: unknown 37031 1727204397.30508: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/AnsiballZ_command.py 37031 1727204397.31162: Sending initial data 37031 1727204397.31172: Sent initial data (156 bytes) 37031 1727204397.33561: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204397.33646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.33661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.33671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.33711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.33719: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204397.33728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.33743: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204397.33751: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204397.33762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204397.33771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.33781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.33884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.33892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 
1727204397.33899: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204397.33908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.33984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.34000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.34003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204397.34336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204397.36166: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204397.36196: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204397.36237: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpy8vyv32h /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/AnsiballZ_command.py <<< 37031 1727204397.36274: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204397.37650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204397.37786: stderr chunk (state=3): >>><<< 37031 1727204397.37790: stdout chunk (state=3): >>><<< 37031 1727204397.37792: done transferring 
module to remote 37031 1727204397.37798: _low_level_execute_command(): starting 37031 1727204397.37801: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/ /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/AnsiballZ_command.py && sleep 0' 37031 1727204397.39351: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204397.39361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.39374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.39388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.39427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.39436: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204397.39448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.39530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204397.39536: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204397.39547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204397.39559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.39566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.39579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.39586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.39593: stderr chunk (state=3): >>>debug2: match found <<< 
37031 1727204397.39602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.39678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.39692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.39783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204397.39849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204397.41776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204397.41780: stdout chunk (state=3): >>><<< 37031 1727204397.41783: stderr chunk (state=3): >>><<< 37031 1727204397.41873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204397.41878: 
_low_level_execute_command(): starting 37031 1727204397.41881: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/AnsiballZ_command.py && sleep 0' 37031 1727204397.43925: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.43929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.43952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.43958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.43961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.44148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.44195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.44211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204397.44426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204397.59928: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, 
"cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:59:57.578251", "end": "2024-09-24 14:59:57.598236", "delta": "0:00:00.019985", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204397.61291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204397.61351: stderr chunk (state=3): >>><<< 37031 1727204397.61355: stdout chunk (state=3): >>><<< 37031 1727204397.61507: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-24 14:59:57.578251", "end": "2024-09-24 14:59:57.598236", "delta": "0:00:00.019985", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204397.61516: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204397.61518: _low_level_execute_command(): starting 37031 1727204397.61520: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204397.2513423-38272-104940584939737/ > /dev/null 2>&1 && sleep 0' 37031 1727204397.63143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204397.63268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.63285: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.63304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.63403: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.63418: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204397.63434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.63452: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204397.63475: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204397.63489: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204397.63502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204397.63595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204397.63611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204397.63623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204397.63636: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204397.63651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204397.63737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204397.63819: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204397.63837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204397.63924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204397.65881: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 37031 1727204397.65884: stdout chunk (state=3): >>><<< 37031 1727204397.65887: stderr chunk (state=3): >>><<< 37031 1727204397.65973: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204397.65976: handler run complete 37031 1727204397.65979: Evaluated conditional (False): False 37031 1727204397.65981: attempt loop complete, returning result 37031 1727204397.65982: _execute() done 37031 1727204397.65984: dumping result to json 37031 1727204397.65986: done dumping result, returning 37031 1727204397.65988: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [0affcd87-79f5-b754-dfb8-0000000004b3] 37031 1727204397.66173: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b3 37031 1727204397.66247: done sending task result for task 
0affcd87-79f5-b754-dfb8-0000000004b3 37031 1727204397.66250: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.019985", "end": "2024-09-24 14:59:57.598236", "rc": 0, "start": "2024-09-24 14:59:57.578251" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 37031 1727204397.66347: no more pending results, returning what we have 37031 1727204397.66352: results queue empty 37031 1727204397.66353: checking for any_errors_fatal 37031 1727204397.66363: done checking for any_errors_fatal 37031 1727204397.66372: checking for max_fail_percentage 37031 1727204397.66375: done checking for max_fail_percentage 37031 1727204397.66376: checking to see if all hosts have failed and the running result is not ok 37031 1727204397.66377: done checking to see if all hosts have failed 37031 1727204397.66377: getting the remaining hosts for this loop 37031 1727204397.66379: done getting the remaining hosts for this loop 37031 1727204397.66384: getting the next task for host managed-node2 37031 1727204397.66391: done getting next task for host managed-node2 37031 1727204397.66394: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 37031 1727204397.66398: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204397.66402: getting variables 37031 1727204397.66404: in VariableManager get_vars() 37031 1727204397.66449: Calling all_inventory to load vars for managed-node2 37031 1727204397.66451: Calling groups_inventory to load vars for managed-node2 37031 1727204397.66454: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204397.66469: Calling all_plugins_play to load vars for managed-node2 37031 1727204397.66472: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204397.66475: Calling groups_plugins_play to load vars for managed-node2 37031 1727204397.70592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204397.74806: done with get_vars() 37031 1727204397.74841: done getting variables 37031 1727204397.74911: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:59:57 -0400 (0:00:00.592) 0:00:20.294 ***** 37031 1727204397.74943: entering _queue_task() for managed-node2/set_fact 37031 1727204397.75987: worker is 1 (out of 1 available) 37031 1727204397.76001: exiting _queue_task() for managed-node2/set_fact 37031 1727204397.76017: done queuing things up, now waiting for results queue to drain 37031 1727204397.76018: waiting for pending results... 
37031 1727204397.77298: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 37031 1727204397.77760: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b4 37031 1727204397.77871: variable 'ansible_search_path' from source: unknown 37031 1727204397.77881: variable 'ansible_search_path' from source: unknown 37031 1727204397.77924: calling self._execute() 37031 1727204397.78061: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.78300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.78317: variable 'omit' from source: magic vars 37031 1727204397.79552: variable 'ansible_distribution_major_version' from source: facts 37031 1727204397.79576: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204397.79956: variable 'nm_profile_exists' from source: set_fact 37031 1727204397.80052: Evaluated conditional (nm_profile_exists.rc == 0): True 37031 1727204397.80066: variable 'omit' from source: magic vars 37031 1727204397.80118: variable 'omit' from source: magic vars 37031 1727204397.80206: variable 'omit' from source: magic vars 37031 1727204397.80295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204397.80399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204397.80493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204397.80515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204397.80530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204397.80604: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
37031 1727204397.80698: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.80707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.80919: Set connection var ansible_connection to ssh 37031 1727204397.80927: Set connection var ansible_shell_type to sh 37031 1727204397.80939: Set connection var ansible_pipelining to False 37031 1727204397.80951: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204397.80959: Set connection var ansible_timeout to 10 37031 1727204397.80971: Set connection var ansible_shell_executable to /bin/sh 37031 1727204397.81001: variable 'ansible_shell_executable' from source: unknown 37031 1727204397.81008: variable 'ansible_connection' from source: unknown 37031 1727204397.81022: variable 'ansible_module_compression' from source: unknown 37031 1727204397.81029: variable 'ansible_shell_type' from source: unknown 37031 1727204397.81036: variable 'ansible_shell_executable' from source: unknown 37031 1727204397.81132: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.81140: variable 'ansible_pipelining' from source: unknown 37031 1727204397.81147: variable 'ansible_timeout' from source: unknown 37031 1727204397.81155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.81340: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204397.81473: variable 'omit' from source: magic vars 37031 1727204397.81483: starting attempt loop 37031 1727204397.81489: running the handler 37031 1727204397.81506: handler run complete 37031 1727204397.81519: attempt loop complete, returning result 37031 1727204397.81525: _execute() done 
37031 1727204397.81531: dumping result to json 37031 1727204397.81538: done dumping result, returning 37031 1727204397.81549: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-b754-dfb8-0000000004b4] 37031 1727204397.81574: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b4 37031 1727204397.81773: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b4 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 37031 1727204397.81835: no more pending results, returning what we have 37031 1727204397.81839: results queue empty 37031 1727204397.81840: checking for any_errors_fatal 37031 1727204397.81848: done checking for any_errors_fatal 37031 1727204397.81849: checking for max_fail_percentage 37031 1727204397.81851: done checking for max_fail_percentage 37031 1727204397.81852: checking to see if all hosts have failed and the running result is not ok 37031 1727204397.81853: done checking to see if all hosts have failed 37031 1727204397.81854: getting the remaining hosts for this loop 37031 1727204397.81855: done getting the remaining hosts for this loop 37031 1727204397.81860: getting the next task for host managed-node2 37031 1727204397.81872: done getting next task for host managed-node2 37031 1727204397.81875: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 37031 1727204397.81879: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204397.81884: getting variables 37031 1727204397.81888: in VariableManager get_vars() 37031 1727204397.81935: Calling all_inventory to load vars for managed-node2 37031 1727204397.81938: Calling groups_inventory to load vars for managed-node2 37031 1727204397.81940: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204397.81951: Calling all_plugins_play to load vars for managed-node2 37031 1727204397.81954: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204397.81957: Calling groups_plugins_play to load vars for managed-node2 37031 1727204397.83044: WORKER PROCESS EXITING 37031 1727204397.84726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204397.88207: done with get_vars() 37031 1727204397.88235: done getting variables 37031 1727204397.88303: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204397.88430: variable 'profile' from source: include params 37031 1727204397.88436: variable 'interface' from source: play vars 37031 1727204397.88514: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:59:57 -0400 (0:00:00.136) 0:00:20.430 ***** 37031 1727204397.88553: entering _queue_task() for managed-node2/command 37031 1727204397.88883: worker is 1 (out of 1 available) 37031 1727204397.88895: exiting _queue_task() for managed-node2/command 37031 1727204397.88906: done queuing things up, now waiting for results queue to drain 37031 1727204397.88908: waiting for pending results... 37031 1727204397.89199: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-veth0 37031 1727204397.89325: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b6 37031 1727204397.89342: variable 'ansible_search_path' from source: unknown 37031 1727204397.89356: variable 'ansible_search_path' from source: unknown 37031 1727204397.89395: calling self._execute() 37031 1727204397.89497: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.89510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.89523: variable 'omit' from source: magic vars 37031 1727204397.89904: variable 'ansible_distribution_major_version' from source: facts 37031 1727204397.89925: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204397.90061: variable 'profile_stat' from source: set_fact 37031 1727204397.90083: Evaluated conditional (profile_stat.stat.exists): False 37031 1727204397.90090: when evaluation is False, skipping this task 37031 1727204397.90098: _execute() done 37031 1727204397.90106: dumping result to json 37031 1727204397.90120: done dumping result, returning 37031 1727204397.90134: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-b754-dfb8-0000000004b6] 37031 1727204397.90144: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b6 skipping: 
[managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 37031 1727204397.90294: no more pending results, returning what we have 37031 1727204397.90299: results queue empty 37031 1727204397.90300: checking for any_errors_fatal 37031 1727204397.90308: done checking for any_errors_fatal 37031 1727204397.90308: checking for max_fail_percentage 37031 1727204397.90310: done checking for max_fail_percentage 37031 1727204397.90311: checking to see if all hosts have failed and the running result is not ok 37031 1727204397.90312: done checking to see if all hosts have failed 37031 1727204397.90313: getting the remaining hosts for this loop 37031 1727204397.90314: done getting the remaining hosts for this loop 37031 1727204397.90319: getting the next task for host managed-node2 37031 1727204397.90326: done getting next task for host managed-node2 37031 1727204397.90328: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 37031 1727204397.90333: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204397.90337: getting variables 37031 1727204397.90339: in VariableManager get_vars() 37031 1727204397.90390: Calling all_inventory to load vars for managed-node2 37031 1727204397.90393: Calling groups_inventory to load vars for managed-node2 37031 1727204397.90395: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204397.90408: Calling all_plugins_play to load vars for managed-node2 37031 1727204397.90410: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204397.90413: Calling groups_plugins_play to load vars for managed-node2 37031 1727204397.91451: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b6 37031 1727204397.91456: WORKER PROCESS EXITING 37031 1727204397.92337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204397.94054: done with get_vars() 37031 1727204397.94086: done getting variables 37031 1727204397.94147: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204397.94267: variable 'profile' from source: include params 37031 1727204397.94271: variable 'interface' from source: play vars 37031 1727204397.94337: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:59:57 -0400 (0:00:00.058) 0:00:20.488 ***** 37031 1727204397.94371: entering _queue_task() for managed-node2/set_fact 37031 1727204397.94701: worker is 1 (out of 1 available) 37031 1727204397.94717: exiting _queue_task() for managed-node2/set_fact 37031 
1727204397.94729: done queuing things up, now waiting for results queue to drain 37031 1727204397.94731: waiting for pending results... 37031 1727204397.95024: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 37031 1727204397.95162: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b7 37031 1727204397.95185: variable 'ansible_search_path' from source: unknown 37031 1727204397.95192: variable 'ansible_search_path' from source: unknown 37031 1727204397.95231: calling self._execute() 37031 1727204397.95334: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204397.95346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204397.95361: variable 'omit' from source: magic vars 37031 1727204397.95757: variable 'ansible_distribution_major_version' from source: facts 37031 1727204397.95778: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204397.95913: variable 'profile_stat' from source: set_fact 37031 1727204397.95935: Evaluated conditional (profile_stat.stat.exists): False 37031 1727204397.95943: when evaluation is False, skipping this task 37031 1727204397.95949: _execute() done 37031 1727204397.95956: dumping result to json 37031 1727204397.95962: done dumping result, returning 37031 1727204397.95975: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 [0affcd87-79f5-b754-dfb8-0000000004b7] 37031 1727204397.95983: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b7 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 37031 1727204397.96131: no more pending results, returning what we have 37031 1727204397.96135: results queue empty 37031 1727204397.96136: checking for any_errors_fatal 37031 1727204397.96143: done checking for any_errors_fatal 37031 1727204397.96144: checking 
for max_fail_percentage 37031 1727204397.96146: done checking for max_fail_percentage 37031 1727204397.96147: checking to see if all hosts have failed and the running result is not ok 37031 1727204397.96148: done checking to see if all hosts have failed 37031 1727204397.96149: getting the remaining hosts for this loop 37031 1727204397.96150: done getting the remaining hosts for this loop 37031 1727204397.96154: getting the next task for host managed-node2 37031 1727204397.96163: done getting next task for host managed-node2 37031 1727204397.96167: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 37031 1727204397.96172: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204397.96177: getting variables 37031 1727204397.96179: in VariableManager get_vars() 37031 1727204397.96226: Calling all_inventory to load vars for managed-node2 37031 1727204397.96229: Calling groups_inventory to load vars for managed-node2 37031 1727204397.96232: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204397.96246: Calling all_plugins_play to load vars for managed-node2 37031 1727204397.96248: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204397.96251: Calling groups_plugins_play to load vars for managed-node2 37031 1727204397.97284: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b7 37031 1727204397.97288: WORKER PROCESS EXITING 37031 1727204397.98130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.01697: done with get_vars() 37031 1727204398.01732: done getting variables 37031 1727204398.01938: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204398.02112: variable 'profile' from source: include params 37031 1727204398.02116: variable 'interface' from source: play vars 37031 1727204398.02297: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.079) 0:00:20.568 ***** 37031 1727204398.02329: entering _queue_task() for managed-node2/command 37031 1727204398.03124: worker is 1 (out of 1 available) 37031 1727204398.03137: exiting _queue_task() for managed-node2/command 37031 
1727204398.03151: done queuing things up, now waiting for results queue to drain 37031 1727204398.03153: waiting for pending results... 37031 1727204398.04038: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-veth0 37031 1727204398.04276: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b8 37031 1727204398.04311: variable 'ansible_search_path' from source: unknown 37031 1727204398.04414: variable 'ansible_search_path' from source: unknown 37031 1727204398.04459: calling self._execute() 37031 1727204398.04583: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.04648: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.04758: variable 'omit' from source: magic vars 37031 1727204398.05445: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.05465: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.05710: variable 'profile_stat' from source: set_fact 37031 1727204398.05843: Evaluated conditional (profile_stat.stat.exists): False 37031 1727204398.05851: when evaluation is False, skipping this task 37031 1727204398.05858: _execute() done 37031 1727204398.05868: dumping result to json 37031 1727204398.05877: done dumping result, returning 37031 1727204398.05888: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-b754-dfb8-0000000004b8] 37031 1727204398.05897: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b8 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 37031 1727204398.06051: no more pending results, returning what we have 37031 1727204398.06056: results queue empty 37031 1727204398.06057: checking for any_errors_fatal 37031 1727204398.06067: done checking for any_errors_fatal 37031 1727204398.06068: checking for 
max_fail_percentage 37031 1727204398.06071: done checking for max_fail_percentage 37031 1727204398.06071: checking to see if all hosts have failed and the running result is not ok 37031 1727204398.06072: done checking to see if all hosts have failed 37031 1727204398.06073: getting the remaining hosts for this loop 37031 1727204398.06075: done getting the remaining hosts for this loop 37031 1727204398.06079: getting the next task for host managed-node2 37031 1727204398.06087: done getting next task for host managed-node2 37031 1727204398.06090: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 37031 1727204398.06094: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204398.06099: getting variables 37031 1727204398.06101: in VariableManager get_vars() 37031 1727204398.06149: Calling all_inventory to load vars for managed-node2 37031 1727204398.06152: Calling groups_inventory to load vars for managed-node2 37031 1727204398.06155: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204398.06170: Calling all_plugins_play to load vars for managed-node2 37031 1727204398.06173: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204398.06176: Calling groups_plugins_play to load vars for managed-node2 37031 1727204398.07586: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b8 37031 1727204398.07590: WORKER PROCESS EXITING 37031 1727204398.10222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.13860: done with get_vars() 37031 1727204398.13893: done getting variables 37031 1727204398.13958: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204398.14080: variable 'profile' from source: include params 37031 1727204398.14084: variable 'interface' from source: play vars 37031 1727204398.14145: variable 'interface' from source: play vars
TASK [Verify the fingerprint comment in ifcfg-veth0] ***************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.122) 0:00:20.690 *****
37031 1727204398.14582: entering _queue_task() for managed-node2/set_fact 37031 1727204398.15123: worker is 1 (out of 1 available) 37031 1727204398.15136: exiting _queue_task() for managed-node2/set_fact 37031
1727204398.15151: done queuing things up, now waiting for results queue to drain 37031 1727204398.15152: waiting for pending results... 37031 1727204398.16123: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-veth0 37031 1727204398.16392: in run() - task 0affcd87-79f5-b754-dfb8-0000000004b9 37031 1727204398.16405: variable 'ansible_search_path' from source: unknown 37031 1727204398.16409: variable 'ansible_search_path' from source: unknown 37031 1727204398.16448: calling self._execute() 37031 1727204398.16666: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.16672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.16688: variable 'omit' from source: magic vars 37031 1727204398.18019: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.18030: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.18368: variable 'profile_stat' from source: set_fact 37031 1727204398.18381: Evaluated conditional (profile_stat.stat.exists): False 37031 1727204398.18384: when evaluation is False, skipping this task 37031 1727204398.18386: _execute() done 37031 1727204398.18389: dumping result to json 37031 1727204398.18391: done dumping result, returning 37031 1727204398.18399: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-veth0 [0affcd87-79f5-b754-dfb8-0000000004b9] 37031 1727204398.18402: sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b9 37031 1727204398.18505: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000004b9 37031 1727204398.18508: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 37031 1727204398.18568: no more pending results, returning what we have 37031 1727204398.18573: results queue empty 37031 
1727204398.18575: checking for any_errors_fatal 37031 1727204398.18581: done checking for any_errors_fatal 37031 1727204398.18581: checking for max_fail_percentage 37031 1727204398.18583: done checking for max_fail_percentage 37031 1727204398.18584: checking to see if all hosts have failed and the running result is not ok 37031 1727204398.18585: done checking to see if all hosts have failed 37031 1727204398.18586: getting the remaining hosts for this loop 37031 1727204398.18588: done getting the remaining hosts for this loop 37031 1727204398.18592: getting the next task for host managed-node2 37031 1727204398.18602: done getting next task for host managed-node2 37031 1727204398.18605: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 37031 1727204398.18608: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204398.18612: getting variables 37031 1727204398.18615: in VariableManager get_vars() 37031 1727204398.18667: Calling all_inventory to load vars for managed-node2 37031 1727204398.18670: Calling groups_inventory to load vars for managed-node2 37031 1727204398.18672: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204398.18685: Calling all_plugins_play to load vars for managed-node2 37031 1727204398.18688: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204398.18691: Calling groups_plugins_play to load vars for managed-node2 37031 1727204398.21440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.25604: done with get_vars() 37031 1727204398.25629: done getting variables 37031 1727204398.25821: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204398.26188: variable 'profile' from source: include params 37031 1727204398.26192: variable 'interface' from source: play vars 37031 1727204398.26444: variable 'interface' from source: play vars
TASK [Assert that the profile is present - 'veth0'] ****************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.119) 0:00:20.809 *****
37031 1727204398.26495: entering _queue_task() for managed-node2/assert 37031 1727204398.27253: worker is 1 (out of 1 available) 37031 1727204398.27266: exiting _queue_task() for managed-node2/assert 37031 1727204398.27280: done queuing things up, now waiting for results queue to drain 37031 1727204398.27282: waiting for pending results...
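The two "skipping" results above both report `"false_condition": "profile_stat.stat.exists"`: each task is guarded by a `when:` on a previously registered stat fact, and TaskExecutor short-circuits before any module runs. A minimal sketch of that pattern — the task body and file path here are hypothetical, not the actual contents of get_profile_stat.yml:

```yaml
# Hypothetical sketch of a stat-guarded task. When the registered
# profile_stat says the file is absent, Ansible evaluates the `when`
# conditional to False and emits exactly the "skipping: ... Conditional
# result was False" result seen in the log above.
- name: Stat the ifcfg file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat

- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: cat "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: fingerprint
  when: profile_stat.stat.exists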
37031 1727204398.29222: running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'veth0' 37031 1727204398.29794: in run() - task 0affcd87-79f5-b754-dfb8-0000000003b9 37031 1727204398.29818: variable 'ansible_search_path' from source: unknown 37031 1727204398.29826: variable 'ansible_search_path' from source: unknown 37031 1727204398.30094: calling self._execute() 37031 1727204398.30245: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.30336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.30352: variable 'omit' from source: magic vars 37031 1727204398.31085: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.31186: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.31204: variable 'omit' from source: magic vars 37031 1727204398.31256: variable 'omit' from source: magic vars 37031 1727204398.31480: variable 'profile' from source: include params 37031 1727204398.31545: variable 'interface' from source: play vars 37031 1727204398.31723: variable 'interface' from source: play vars 37031 1727204398.31878: variable 'omit' from source: magic vars 37031 1727204398.32010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204398.32108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204398.32207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204398.32315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.32333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.32369: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 37031 1727204398.32617: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.32631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.32953: Set connection var ansible_connection to ssh 37031 1727204398.32961: Set connection var ansible_shell_type to sh 37031 1727204398.32979: Set connection var ansible_pipelining to False 37031 1727204398.32992: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204398.33001: Set connection var ansible_timeout to 10 37031 1727204398.33010: Set connection var ansible_shell_executable to /bin/sh 37031 1727204398.33040: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.33052: variable 'ansible_connection' from source: unknown 37031 1727204398.33063: variable 'ansible_module_compression' from source: unknown 37031 1727204398.33072: variable 'ansible_shell_type' from source: unknown 37031 1727204398.33166: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.33274: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.33285: variable 'ansible_pipelining' from source: unknown 37031 1727204398.33292: variable 'ansible_timeout' from source: unknown 37031 1727204398.33299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.33462: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204398.33761: variable 'omit' from source: magic vars 37031 1727204398.33851: starting attempt loop 37031 1727204398.33952: running the handler 37031 1727204398.34408: variable 'lsr_net_profile_exists' from source: set_fact 37031 1727204398.34472: Evaluated conditional 
(lsr_net_profile_exists): True 37031 1727204398.34508: handler run complete 37031 1727204398.34528: attempt loop complete, returning result 37031 1727204398.34534: _execute() done 37031 1727204398.34576: dumping result to json 37031 1727204398.34585: done dumping result, returning 37031 1727204398.34595: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is present - 'veth0' [0affcd87-79f5-b754-dfb8-0000000003b9] 37031 1727204398.34612: sending task result for task 0affcd87-79f5-b754-dfb8-0000000003b9 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204398.34773: no more pending results, returning what we have 37031 1727204398.34776: results queue empty 37031 1727204398.34777: checking for any_errors_fatal 37031 1727204398.34786: done checking for any_errors_fatal 37031 1727204398.34787: checking for max_fail_percentage 37031 1727204398.34789: done checking for max_fail_percentage 37031 1727204398.34790: checking to see if all hosts have failed and the running result is not ok 37031 1727204398.34791: done checking to see if all hosts have failed 37031 1727204398.34792: getting the remaining hosts for this loop 37031 1727204398.34794: done getting the remaining hosts for this loop 37031 1727204398.34799: getting the next task for host managed-node2 37031 1727204398.34806: done getting next task for host managed-node2 37031 1727204398.34809: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 37031 1727204398.34812: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204398.34817: getting variables 37031 1727204398.34820: in VariableManager get_vars() 37031 1727204398.34874: Calling all_inventory to load vars for managed-node2 37031 1727204398.34877: Calling groups_inventory to load vars for managed-node2 37031 1727204398.34879: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204398.34892: Calling all_plugins_play to load vars for managed-node2 37031 1727204398.34894: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204398.34897: Calling groups_plugins_play to load vars for managed-node2 37031 1727204398.36162: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000003b9 37031 1727204398.36168: WORKER PROCESS EXITING 37031 1727204398.40310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.43222: done with get_vars() 37031 1727204398.43255: done getting variables 37031 1727204398.43721: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204398.44482: variable 'profile' from source: include params 37031 1727204398.44486: variable 'interface' from source: play vars 37031 1727204398.44966: variable 'interface' from source: play vars
TASK [Assert that the ansible managed comment is present in 'veth0'] ***********
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.185) 0:00:20.995 *****
37031 1727204398.45093: entering _queue_task() for managed-node2/assert 37031 1727204398.46325: worker is 1
(out of 1 available) 37031 1727204398.46337: exiting _queue_task() for managed-node2/assert 37031 1727204398.46350: done queuing things up, now waiting for results queue to drain 37031 1727204398.46351: waiting for pending results... 37031 1727204398.47228: running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'veth0' 37031 1727204398.47576: in run() - task 0affcd87-79f5-b754-dfb8-0000000003ba 37031 1727204398.47594: variable 'ansible_search_path' from source: unknown 37031 1727204398.47597: variable 'ansible_search_path' from source: unknown 37031 1727204398.47637: calling self._execute() 37031 1727204398.47845: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.47848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.47864: variable 'omit' from source: magic vars 37031 1727204398.48885: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.48900: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.48903: variable 'omit' from source: magic vars 37031 1727204398.48957: variable 'omit' from source: magic vars 37031 1727204398.49070: variable 'profile' from source: include params 37031 1727204398.49074: variable 'interface' from source: play vars 37031 1727204398.49330: variable 'interface' from source: play vars 37031 1727204398.49393: variable 'omit' from source: magic vars 37031 1727204398.49539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204398.49649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204398.49689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204398.49781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 37031 1727204398.49793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.49823: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204398.49826: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.49828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.50184: Set connection var ansible_connection to ssh 37031 1727204398.50187: Set connection var ansible_shell_type to sh 37031 1727204398.50247: Set connection var ansible_pipelining to False 37031 1727204398.50250: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204398.50260: Set connection var ansible_timeout to 10 37031 1727204398.50315: Set connection var ansible_shell_executable to /bin/sh 37031 1727204398.50395: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.50398: variable 'ansible_connection' from source: unknown 37031 1727204398.50401: variable 'ansible_module_compression' from source: unknown 37031 1727204398.50403: variable 'ansible_shell_type' from source: unknown 37031 1727204398.50405: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.50407: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.50476: variable 'ansible_pipelining' from source: unknown 37031 1727204398.50505: variable 'ansible_timeout' from source: unknown 37031 1727204398.50508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.50923: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204398.50968: 
variable 'omit' from source: magic vars 37031 1727204398.51078: starting attempt loop 37031 1727204398.51082: running the handler 37031 1727204398.51280: variable 'lsr_net_profile_ansible_managed' from source: set_fact 37031 1727204398.51284: Evaluated conditional (lsr_net_profile_ansible_managed): True 37031 1727204398.51292: handler run complete 37031 1727204398.51366: attempt loop complete, returning result 37031 1727204398.51372: _execute() done 37031 1727204398.51375: dumping result to json 37031 1727204398.51377: done dumping result, returning 37031 1727204398.51418: done running TaskExecutor() for managed-node2/TASK: Assert that the ansible managed comment is present in 'veth0' [0affcd87-79f5-b754-dfb8-0000000003ba] 37031 1727204398.51455: sending task result for task 0affcd87-79f5-b754-dfb8-0000000003ba 37031 1727204398.51623: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000003ba 37031 1727204398.51627: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204398.51692: no more pending results, returning what we have 37031 1727204398.51696: results queue empty 37031 1727204398.51697: checking for any_errors_fatal 37031 1727204398.51703: done checking for any_errors_fatal 37031 1727204398.51704: checking for max_fail_percentage 37031 1727204398.51708: done checking for max_fail_percentage 37031 1727204398.51709: checking to see if all hosts have failed and the running result is not ok 37031 1727204398.51710: done checking to see if all hosts have failed 37031 1727204398.51710: getting the remaining hosts for this loop 37031 1727204398.51712: done getting the remaining hosts for this loop 37031 1727204398.51720: getting the next task for host managed-node2 37031 1727204398.51727: done getting next task for host managed-node2 37031 1727204398.51731: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 37031 1727204398.51734: ^ state is: HOST STATE: block=3, task=7, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204398.51738: getting variables 37031 1727204398.51743: in VariableManager get_vars() 37031 1727204398.51803: Calling all_inventory to load vars for managed-node2 37031 1727204398.51806: Calling groups_inventory to load vars for managed-node2 37031 1727204398.51809: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204398.51821: Calling all_plugins_play to load vars for managed-node2 37031 1727204398.51824: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204398.51828: Calling groups_plugins_play to load vars for managed-node2 37031 1727204398.56766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.61330: done with get_vars() 37031 1727204398.61454: done getting variables 37031 1727204398.61673: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204398.61835: variable 'profile' from source: include params 37031 1727204398.61839: variable 'interface' from source: play vars 37031 1727204398.62018: variable 'interface' from source: play vars
TASK [Assert that the fingerprint comment is present in veth0] *****************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15
Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.169) 0:00:21.165 *****
37031 1727204398.62057: entering _queue_task() for managed-node2/assert 37031 1727204398.62876: worker is 1 (out of 1 available) 37031 1727204398.62888: exiting _queue_task() for managed-node2/assert 37031 1727204398.62903: done queuing things up, now waiting for results queue to drain 37031 1727204398.62905: waiting for pending results... 37031 1727204398.63703: running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in veth0 37031 1727204398.63946: in run() - task 0affcd87-79f5-b754-dfb8-0000000003bb 37031 1727204398.64086: variable 'ansible_search_path' from source: unknown 37031 1727204398.64096: variable 'ansible_search_path' from source: unknown 37031 1727204398.64138: calling self._execute() 37031 1727204398.64310: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.64362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.64402: variable 'omit' from source: magic vars 37031 1727204398.65148: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.65236: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.65319: variable 'omit' from source: magic vars 37031 1727204398.65363: variable 'omit' from source: magic vars 37031 1727204398.65610: variable 'profile' from source: include params 37031 1727204398.65647: variable 'interface' from source: play vars 37031 1727204398.65778: variable 'interface' from source: play vars 37031 1727204398.65936: variable 'omit' from source: magic vars 37031 1727204398.65985: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204398.66057: Loading Connection 'ssh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204398.66087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204398.66153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.66203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.66306: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204398.66336: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.66356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.66556: Set connection var ansible_connection to ssh 37031 1727204398.66575: Set connection var ansible_shell_type to sh 37031 1727204398.66605: Set connection var ansible_pipelining to False 37031 1727204398.66638: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204398.66666: Set connection var ansible_timeout to 10 37031 1727204398.66783: Set connection var ansible_shell_executable to /bin/sh 37031 1727204398.66821: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.66829: variable 'ansible_connection' from source: unknown 37031 1727204398.66837: variable 'ansible_module_compression' from source: unknown 37031 1727204398.66844: variable 'ansible_shell_type' from source: unknown 37031 1727204398.66851: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.66858: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.66875: variable 'ansible_pipelining' from source: unknown 37031 1727204398.66884: variable 'ansible_timeout' from source: unknown 37031 1727204398.66893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 
1727204398.67150: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204398.67239: variable 'omit' from source: magic vars 37031 1727204398.67250: starting attempt loop 37031 1727204398.67256: running the handler 37031 1727204398.67486: variable 'lsr_net_profile_fingerprint' from source: set_fact 37031 1727204398.67495: Evaluated conditional (lsr_net_profile_fingerprint): True 37031 1727204398.67505: handler run complete 37031 1727204398.67571: attempt loop complete, returning result 37031 1727204398.67579: _execute() done 37031 1727204398.67586: dumping result to json 37031 1727204398.67594: done dumping result, returning 37031 1727204398.67603: done running TaskExecutor() for managed-node2/TASK: Assert that the fingerprint comment is present in veth0 [0affcd87-79f5-b754-dfb8-0000000003bb] 37031 1727204398.67611: sending task result for task 0affcd87-79f5-b754-dfb8-0000000003bb 37031 1727204398.67765: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000003bb ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204398.67815: no more pending results, returning what we have 37031 1727204398.67819: results queue empty 37031 1727204398.67820: checking for any_errors_fatal 37031 1727204398.67826: done checking for any_errors_fatal 37031 1727204398.67827: checking for max_fail_percentage 37031 1727204398.67828: done checking for max_fail_percentage 37031 1727204398.67829: checking to see if all hosts have failed and the running result is not ok 37031 1727204398.67830: done checking to see if all hosts have failed 37031 1727204398.67831: getting the remaining hosts for this loop 37031 1727204398.67832: done getting the remaining hosts for this loop 37031 1727204398.67836: 
getting the next task for host managed-node2 37031 1727204398.67843: done getting next task for host managed-node2 37031 1727204398.67846: ^ task is: TASK: Get ip address information 37031 1727204398.67848: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204398.67852: getting variables 37031 1727204398.67854: in VariableManager get_vars() 37031 1727204398.67907: Calling all_inventory to load vars for managed-node2 37031 1727204398.67910: Calling groups_inventory to load vars for managed-node2 37031 1727204398.67912: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204398.67925: Calling all_plugins_play to load vars for managed-node2 37031 1727204398.67927: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204398.67932: Calling groups_plugins_play to load vars for managed-node2 37031 1727204398.68470: WORKER PROCESS EXITING 37031 1727204398.70252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204398.72272: done with get_vars() 37031 1727204398.72308: done getting variables 37031 1727204398.72377: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Get ip address information] **********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53
Tuesday 24 September 2024 14:59:58 -0400 (0:00:00.103) 0:00:21.268 *****
37031 1727204398.72413: entering
_queue_task() for managed-node2/command 37031 1727204398.72791: worker is 1 (out of 1 available) 37031 1727204398.72805: exiting _queue_task() for managed-node2/command 37031 1727204398.72818: done queuing things up, now waiting for results queue to drain 37031 1727204398.72820: waiting for pending results... 37031 1727204398.73135: running TaskExecutor() for managed-node2/TASK: Get ip address information 37031 1727204398.73247: in run() - task 0affcd87-79f5-b754-dfb8-00000000005e 37031 1727204398.73279: variable 'ansible_search_path' from source: unknown 37031 1727204398.73323: calling self._execute() 37031 1727204398.73440: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.73452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.73474: variable 'omit' from source: magic vars 37031 1727204398.74026: variable 'ansible_distribution_major_version' from source: facts 37031 1727204398.74047: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204398.74061: variable 'omit' from source: magic vars 37031 1727204398.74087: variable 'omit' from source: magic vars 37031 1727204398.74192: variable 'interface' from source: play vars 37031 1727204398.74219: variable 'omit' from source: magic vars 37031 1727204398.74282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204398.74323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204398.74361: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204398.74389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204398.74407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 
1727204398.74443: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204398.74454: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.74470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.74586: Set connection var ansible_connection to ssh 37031 1727204398.74598: Set connection var ansible_shell_type to sh 37031 1727204398.74612: Set connection var ansible_pipelining to False 37031 1727204398.74625: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204398.74635: Set connection var ansible_timeout to 10 37031 1727204398.74644: Set connection var ansible_shell_executable to /bin/sh 37031 1727204398.74689: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.74699: variable 'ansible_connection' from source: unknown 37031 1727204398.74711: variable 'ansible_module_compression' from source: unknown 37031 1727204398.74718: variable 'ansible_shell_type' from source: unknown 37031 1727204398.74725: variable 'ansible_shell_executable' from source: unknown 37031 1727204398.74731: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204398.74738: variable 'ansible_pipelining' from source: unknown 37031 1727204398.74744: variable 'ansible_timeout' from source: unknown 37031 1727204398.74751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204398.74920: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204398.74941: variable 'omit' from source: magic vars 37031 1727204398.74951: starting attempt loop 37031 1727204398.74961: running the handler 37031 1727204398.74983: _low_level_execute_command(): 
starting 37031 1727204398.74996: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204398.75823: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204398.75846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.75877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.75901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.75959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.75975: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204398.75992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.76012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204398.76026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204398.76036: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204398.76047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.76073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.76094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.76125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.76146: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204398.76159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.76255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 
1727204398.76275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204398.76289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204398.76443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204398.78056: stdout chunk (state=3): >>>/root <<< 37031 1727204398.78185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204398.78258: stderr chunk (state=3): >>><<< 37031 1727204398.78262: stdout chunk (state=3): >>><<< 37031 1727204398.78387: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204398.78391: _low_level_execute_command(): starting 37031 1727204398.78394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862 `" && echo ansible-tmp-1727204398.7828872-38376-244015336829862="` echo /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862 `" ) && sleep 0' 37031 1727204398.78984: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204398.78998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.79013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.79036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.79079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.79092: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204398.79107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.79124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204398.79138: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204398.79154: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204398.79169: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.79185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.79201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.79214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.79225: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204398.79241: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.79324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204398.79346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204398.79369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204398.79441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204398.81295: stdout chunk (state=3): >>>ansible-tmp-1727204398.7828872-38376-244015336829862=/root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862 <<< 37031 1727204398.81420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204398.81511: stderr chunk (state=3): >>><<< 37031 1727204398.81527: stdout chunk (state=3): >>><<< 37031 1727204398.81570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204398.7828872-38376-244015336829862=/root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204398.82075: variable 'ansible_module_compression' from source: unknown 37031 1727204398.82078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204398.82081: variable 'ansible_facts' from source: unknown 37031 1727204398.82099: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/AnsiballZ_command.py 37031 1727204398.82718: Sending initial data 37031 1727204398.82723: Sent initial data (156 bytes) 37031 1727204398.84558: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204398.84582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.84599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.84621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.84662: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.84678: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204398.84697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.84719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204398.84731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204398.84743: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204398.84756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.84772: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.84788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.84801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.84815: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204398.84831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.84905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204398.84931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204398.84952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204398.85020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204398.86777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204398.86853: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204398.86857: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmptfyq634m /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/AnsiballZ_command.py <<< 37031 1727204398.86939: stderr chunk 
(state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204398.88082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204398.88169: stderr chunk (state=3): >>><<< 37031 1727204398.88173: stdout chunk (state=3): >>><<< 37031 1727204398.88274: done transferring module to remote 37031 1727204398.88282: _low_level_execute_command(): starting 37031 1727204398.88285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/ /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/AnsiballZ_command.py && sleep 0' 37031 1727204398.88912: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204398.88932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.88946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.88968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.89017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.89100: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204398.89137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.89155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204398.89172: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204398.89189: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204398.89201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.89213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 
1727204398.89228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.89241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.89252: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204398.89271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.89353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204398.89375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204398.89389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204398.89461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204398.91306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204398.91310: stdout chunk (state=3): >>><<< 37031 1727204398.91312: stderr chunk (state=3): >>><<< 37031 1727204398.91315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204398.91317: _low_level_execute_command(): starting 37031 1727204398.91319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/AnsiballZ_command.py && sleep 0' 37031 1727204398.91908: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204398.91916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.91926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.91939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204398.91977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.91984: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204398.91994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.92007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204398.92013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204398.92019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204398.92026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204398.92035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204398.92047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
37031 1727204398.92054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204398.92060: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204398.92071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204398.92141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204398.92155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204398.92166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204398.92538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.06200: stdout chunk (state=3): >>> {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether b2:25:e8:19:99:d4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::be0:f1c6:76f9:f97d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 14:59:59.057171", "end": "2024-09-24 14:59:59.060958", "delta": "0:00:00.003787", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204399.07496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204399.07501: stdout chunk (state=3): >>><<< 37031 1727204399.07504: stderr chunk (state=3): >>><<< 37031 1727204399.07531: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether b2:25:e8:19:99:d4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::be0:f1c6:76f9:f97d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-24 14:59:59.057171", "end": "2024-09-24 14:59:59.060958", "delta": "0:00:00.003787", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204399.07575: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204399.07582: _low_level_execute_command(): starting 37031 1727204399.07588: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204398.7828872-38376-244015336829862/ > /dev/null 2>&1 && sleep 0' 37031 1727204399.09138: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.09142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.09304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
37031 1727204399.09308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.09342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204399.09347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.09530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.09543: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.09549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.09651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.11514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204399.11519: stderr chunk (state=3): >>><<< 37031 1727204399.11521: stdout chunk (state=3): >>><<< 37031 1727204399.11541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204399.11547: handler run complete 37031 1727204399.11578: Evaluated conditional (False): False 37031 1727204399.11590: attempt loop complete, returning result 37031 1727204399.11593: _execute() done 37031 1727204399.11595: dumping result to json 37031 1727204399.11602: done dumping result, returning 37031 1727204399.11611: done running TaskExecutor() for managed-node2/TASK: Get ip address information [0affcd87-79f5-b754-dfb8-00000000005e] 37031 1727204399.11617: sending task result for task 0affcd87-79f5-b754-dfb8-00000000005e 37031 1727204399.11731: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000005e 37031 1727204399.11733: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003787", "end": "2024-09-24 14:59:59.060958", "rc": 0, "start": "2024-09-24 14:59:59.057171" } STDOUT: 31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether b2:25:e8:19:99:d4 brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::be0:f1c6:76f9:f97d/64 scope link noprefixroute valid_lft forever preferred_lft forever 37031 1727204399.11847: no more pending results, returning what we have 37031 1727204399.11851: results queue 
empty 37031 1727204399.11852: checking for any_errors_fatal 37031 1727204399.11858: done checking for any_errors_fatal 37031 1727204399.11858: checking for max_fail_percentage 37031 1727204399.11860: done checking for max_fail_percentage 37031 1727204399.11861: checking to see if all hosts have failed and the running result is not ok 37031 1727204399.11862: done checking to see if all hosts have failed 37031 1727204399.11863: getting the remaining hosts for this loop 37031 1727204399.11866: done getting the remaining hosts for this loop 37031 1727204399.11871: getting the next task for host managed-node2 37031 1727204399.11878: done getting next task for host managed-node2 37031 1727204399.11882: ^ task is: TASK: Show ip_addr 37031 1727204399.11884: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204399.11888: getting variables 37031 1727204399.11890: in VariableManager get_vars() 37031 1727204399.11936: Calling all_inventory to load vars for managed-node2 37031 1727204399.11939: Calling groups_inventory to load vars for managed-node2 37031 1727204399.11941: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204399.11953: Calling all_plugins_play to load vars for managed-node2 37031 1727204399.11955: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204399.11959: Calling groups_plugins_play to load vars for managed-node2 37031 1727204399.13681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204399.15699: done with get_vars() 37031 1727204399.15846: done getting variables 37031 1727204399.15913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Tuesday 24 September 2024 14:59:59 -0400 (0:00:00.436) 0:00:21.705 ***** 37031 1727204399.16113: entering _queue_task() for managed-node2/debug 37031 1727204399.16493: worker is 1 (out of 1 available) 37031 1727204399.16511: exiting _queue_task() for managed-node2/debug 37031 1727204399.16526: done queuing things up, now waiting for results queue to drain 37031 1727204399.16528: waiting for pending results... 
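The "Get ip address information" task traced above registered the stdout of `ip addr show veth0`, which the upcoming "Show ip_addr" debug task then prints; later assertions in such test playbooks typically check the `inet6` addresses in that output. As a minimal illustrative sketch (not part of the actual test suite), the address extraction a follow-up check would perform on this captured stdout looks like:

```python
import re

def inet6_addresses(ip_addr_output: str) -> list[str]:
    """Extract the inet6 CIDR addresses from `ip addr show` output."""
    return re.findall(r"inet6 ([0-9a-f:]+/\d+)", ip_addr_output)

# Sample mirroring the stdout captured in the task result above.
sample = (
    "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    link/ether b2:25:e8:19:99:d4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n"
    "    inet6 2001:db8::2/32 scope global noprefixroute\n"
    "    inet6 2001:db8::3/32 scope global noprefixroute\n"
    "    inet6 2001:db8::4/32 scope global noprefixroute\n"
    "    inet6 fe80::be0:f1c6:76f9:f97d/64 scope link noprefixroute\n"
)
print(inet6_addresses(sample))
```

With the output above this yields the three global `2001:db8::/32` addresses plus the `fe80::` link-local address.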
37031 1727204399.16831: running TaskExecutor() for managed-node2/TASK: Show ip_addr 37031 1727204399.17298: in run() - task 0affcd87-79f5-b754-dfb8-00000000005f 37031 1727204399.17302: variable 'ansible_search_path' from source: unknown 37031 1727204399.17306: calling self._execute() 37031 1727204399.17309: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.17312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.17314: variable 'omit' from source: magic vars 37031 1727204399.17774: variable 'ansible_distribution_major_version' from source: facts 37031 1727204399.17777: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204399.17780: variable 'omit' from source: magic vars 37031 1727204399.17783: variable 'omit' from source: magic vars 37031 1727204399.17786: variable 'omit' from source: magic vars 37031 1727204399.17789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204399.17884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204399.17887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204399.17890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.17892: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.17895: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204399.17897: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.17898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.17905: Set connection var ansible_connection to ssh 37031 1727204399.17907: Set connection 
var ansible_shell_type to sh 37031 1727204399.17909: Set connection var ansible_pipelining to False 37031 1727204399.17911: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204399.17913: Set connection var ansible_timeout to 10 37031 1727204399.17915: Set connection var ansible_shell_executable to /bin/sh 37031 1727204399.17992: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.17996: variable 'ansible_connection' from source: unknown 37031 1727204399.17998: variable 'ansible_module_compression' from source: unknown 37031 1727204399.18001: variable 'ansible_shell_type' from source: unknown 37031 1727204399.18003: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.18005: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.18007: variable 'ansible_pipelining' from source: unknown 37031 1727204399.18009: variable 'ansible_timeout' from source: unknown 37031 1727204399.18011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.18089: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204399.18099: variable 'omit' from source: magic vars 37031 1727204399.18105: starting attempt loop 37031 1727204399.18108: running the handler 37031 1727204399.18227: variable 'ip_addr' from source: set_fact 37031 1727204399.18249: handler run complete 37031 1727204399.18271: attempt loop complete, returning result 37031 1727204399.18274: _execute() done 37031 1727204399.18277: dumping result to json 37031 1727204399.18279: done dumping result, returning 37031 1727204399.18288: done running TaskExecutor() for managed-node2/TASK: Show ip_addr [0affcd87-79f5-b754-dfb8-00000000005f] 37031 
1727204399.18290: sending task result for task 0affcd87-79f5-b754-dfb8-00000000005f ok: [managed-node2] => { "ip_addr.stdout": "31: veth0@if30: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether b2:25:e8:19:99:d4 brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::be0:f1c6:76f9:f97d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 37031 1727204399.18425: no more pending results, returning what we have 37031 1727204399.18429: results queue empty 37031 1727204399.18430: checking for any_errors_fatal 37031 1727204399.18439: done checking for any_errors_fatal 37031 1727204399.18440: checking for max_fail_percentage 37031 1727204399.18442: done checking for max_fail_percentage 37031 1727204399.18443: checking to see if all hosts have failed and the running result is not ok 37031 1727204399.18445: done checking to see if all hosts have failed 37031 1727204399.18445: getting the remaining hosts for this loop 37031 1727204399.18447: done getting the remaining hosts for this loop 37031 1727204399.18452: getting the next task for host managed-node2 37031 1727204399.18461: done getting next task for host managed-node2 37031 1727204399.18467: ^ task is: TASK: Assert ipv6 addresses are correctly set 37031 1727204399.18469: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204399.18473: getting variables 37031 1727204399.18476: in VariableManager get_vars() 37031 1727204399.18532: Calling all_inventory to load vars for managed-node2 37031 1727204399.18535: Calling groups_inventory to load vars for managed-node2 37031 1727204399.18538: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204399.18551: Calling all_plugins_play to load vars for managed-node2 37031 1727204399.18554: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204399.18559: Calling groups_plugins_play to load vars for managed-node2 37031 1727204399.19211: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000005f 37031 1727204399.19216: WORKER PROCESS EXITING 37031 1727204399.20845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204399.23617: done with get_vars() 37031 1727204399.23644: done getting variables 37031 1727204399.23713: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Tuesday 24 September 2024 14:59:59 -0400 (0:00:00.076) 0:00:21.782 ***** 37031 1727204399.23742: entering _queue_task() for managed-node2/assert 37031 1727204399.24636: worker is 1 (out of 1 available) 37031 1727204399.24657: exiting _queue_task() for managed-node2/assert 37031 1727204399.24701: done queuing things up, now waiting for results queue to drain 37031 1727204399.24703: waiting for pending results... 
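The `TASK [Assert ipv6 addresses are correctly set]` banner above (task path `tests_ipv6.yml:60`) maps to an `assert` task. Here the log is more explicit: the three `Evaluated conditional (...)` lines that follow give the exact expressions tested, so the task body can be reconstructed with reasonable confidence (the task name and `that:` expressions come straight from the trace; everything else is inferred):

```yaml
# Reconstruction of tests_ipv6.yml:60; the three conditionals are
# copied verbatim from the "Evaluated conditional" lines in the log.
- name: Assert ipv6 addresses are correctly set
  assert:
    that:
      - "'inet6 2001:db8::2/32' in ip_addr.stdout"
      - "'inet6 2001:db8::3/32' in ip_addr.stdout"
      - "'inet6 2001:db8::4/32' in ip_addr.stdout"
```

All three conditionals evaluate `True` against the `ip_addr.stdout` shown earlier, which is why the task result is `ok` with "All assertions passed".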
37031 1727204399.25012: running TaskExecutor() for managed-node2/TASK: Assert ipv6 addresses are correctly set 37031 1727204399.25103: in run() - task 0affcd87-79f5-b754-dfb8-000000000060 37031 1727204399.25116: variable 'ansible_search_path' from source: unknown 37031 1727204399.25159: calling self._execute() 37031 1727204399.25273: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.25280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.25290: variable 'omit' from source: magic vars 37031 1727204399.25677: variable 'ansible_distribution_major_version' from source: facts 37031 1727204399.25696: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204399.25702: variable 'omit' from source: magic vars 37031 1727204399.25725: variable 'omit' from source: magic vars 37031 1727204399.25766: variable 'omit' from source: magic vars 37031 1727204399.25814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204399.25847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204399.25872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204399.25894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.25910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.25940: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204399.25943: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.25945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.26049: Set connection var ansible_connection to ssh 37031 
1727204399.26052: Set connection var ansible_shell_type to sh 37031 1727204399.26062: Set connection var ansible_pipelining to False 37031 1727204399.26073: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204399.26079: Set connection var ansible_timeout to 10 37031 1727204399.26084: Set connection var ansible_shell_executable to /bin/sh 37031 1727204399.26120: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.26124: variable 'ansible_connection' from source: unknown 37031 1727204399.26126: variable 'ansible_module_compression' from source: unknown 37031 1727204399.26129: variable 'ansible_shell_type' from source: unknown 37031 1727204399.26131: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.26133: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.26137: variable 'ansible_pipelining' from source: unknown 37031 1727204399.26140: variable 'ansible_timeout' from source: unknown 37031 1727204399.26144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.26296: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204399.26306: variable 'omit' from source: magic vars 37031 1727204399.26312: starting attempt loop 37031 1727204399.26315: running the handler 37031 1727204399.27877: variable 'ip_addr' from source: set_fact 37031 1727204399.27881: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 37031 1727204399.27884: variable 'ip_addr' from source: set_fact 37031 1727204399.27888: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 37031 1727204399.27890: variable 'ip_addr' from source: set_fact 37031 1727204399.27891: Evaluated 
conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 37031 1727204399.27893: handler run complete 37031 1727204399.27895: attempt loop complete, returning result 37031 1727204399.27897: _execute() done 37031 1727204399.27898: dumping result to json 37031 1727204399.27903: done dumping result, returning 37031 1727204399.27905: done running TaskExecutor() for managed-node2/TASK: Assert ipv6 addresses are correctly set [0affcd87-79f5-b754-dfb8-000000000060] 37031 1727204399.27907: sending task result for task 0affcd87-79f5-b754-dfb8-000000000060 37031 1727204399.27980: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000060 37031 1727204399.27984: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204399.28044: no more pending results, returning what we have 37031 1727204399.28047: results queue empty 37031 1727204399.28048: checking for any_errors_fatal 37031 1727204399.28052: done checking for any_errors_fatal 37031 1727204399.28053: checking for max_fail_percentage 37031 1727204399.28054: done checking for max_fail_percentage 37031 1727204399.28059: checking to see if all hosts have failed and the running result is not ok 37031 1727204399.28060: done checking to see if all hosts have failed 37031 1727204399.28060: getting the remaining hosts for this loop 37031 1727204399.28062: done getting the remaining hosts for this loop 37031 1727204399.28065: getting the next task for host managed-node2 37031 1727204399.28075: done getting next task for host managed-node2 37031 1727204399.28077: ^ task is: TASK: Get ipv6 routes 37031 1727204399.28079: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204399.28081: getting variables 37031 1727204399.28083: in VariableManager get_vars() 37031 1727204399.28120: Calling all_inventory to load vars for managed-node2 37031 1727204399.28122: Calling groups_inventory to load vars for managed-node2 37031 1727204399.28124: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204399.28133: Calling all_plugins_play to load vars for managed-node2 37031 1727204399.28135: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204399.28137: Calling groups_plugins_play to load vars for managed-node2 37031 1727204399.47902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204399.54847: done with get_vars() 37031 1727204399.55008: done getting variables 37031 1727204399.55190: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Tuesday 24 September 2024 14:59:59 -0400 (0:00:00.314) 0:00:22.097 ***** 37031 1727204399.55218: entering _queue_task() for managed-node2/command 37031 1727204399.56353: worker is 1 (out of 1 available) 37031 1727204399.56370: exiting _queue_task() for managed-node2/command 37031 1727204399.56385: done queuing things up, now waiting for results queue to drain 37031 1727204399.56386: waiting for pending results... 
37031 1727204399.57471: running TaskExecutor() for managed-node2/TASK: Get ipv6 routes 37031 1727204399.57696: in run() - task 0affcd87-79f5-b754-dfb8-000000000061 37031 1727204399.57716: variable 'ansible_search_path' from source: unknown 37031 1727204399.57758: calling self._execute() 37031 1727204399.57956: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.57978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.58010: variable 'omit' from source: magic vars 37031 1727204399.58629: variable 'ansible_distribution_major_version' from source: facts 37031 1727204399.58655: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204399.58677: variable 'omit' from source: magic vars 37031 1727204399.58732: variable 'omit' from source: magic vars 37031 1727204399.58812: variable 'omit' from source: magic vars 37031 1727204399.58858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204399.58913: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204399.58940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204399.58976: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.59001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204399.59038: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204399.59047: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.59055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.59182: Set connection var ansible_connection to ssh 37031 1727204399.59192: Set 
connection var ansible_shell_type to sh 37031 1727204399.59214: Set connection var ansible_pipelining to False 37031 1727204399.59227: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204399.59240: Set connection var ansible_timeout to 10 37031 1727204399.59250: Set connection var ansible_shell_executable to /bin/sh 37031 1727204399.59312: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.59322: variable 'ansible_connection' from source: unknown 37031 1727204399.59339: variable 'ansible_module_compression' from source: unknown 37031 1727204399.59349: variable 'ansible_shell_type' from source: unknown 37031 1727204399.59357: variable 'ansible_shell_executable' from source: unknown 37031 1727204399.59381: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204399.59400: variable 'ansible_pipelining' from source: unknown 37031 1727204399.59429: variable 'ansible_timeout' from source: unknown 37031 1727204399.59440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204399.59651: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204399.59676: variable 'omit' from source: magic vars 37031 1727204399.59704: starting attempt loop 37031 1727204399.59712: running the handler 37031 1727204399.59732: _low_level_execute_command(): starting 37031 1727204399.59758: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204399.60864: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204399.60899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.60916: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 37031 1727204399.60948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.61011: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.61024: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.61038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.61099: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.61114: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.61136: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.61149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.61181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.61210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.61226: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.61238: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.61253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.61349: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.61376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.61393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.61473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.63131: stdout chunk (state=3): >>>/root <<< 37031 
1727204399.63343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204399.63347: stdout chunk (state=3): >>><<< 37031 1727204399.63350: stderr chunk (state=3): >>><<< 37031 1727204399.63488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204399.63492: _low_level_execute_command(): starting 37031 1727204399.63495: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568 `" && echo ansible-tmp-1727204399.6338263-38413-100659609643568="` echo /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568 `" ) && sleep 0' 37031 1727204399.66208: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 37031 1727204399.66824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.66859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.66903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.66915: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.66927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.66952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.66972: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.66984: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.66996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.67008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.67021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.67031: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.67040: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.67051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.67131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.67153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.67175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.67254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 37031 1727204399.69211: stdout chunk (state=3): >>>ansible-tmp-1727204399.6338263-38413-100659609643568=/root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568 <<< 37031 1727204399.69429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204399.69467: stderr chunk (state=3): >>><<< 37031 1727204399.69470: stdout chunk (state=3): >>><<< 37031 1727204399.69725: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204399.6338263-38413-100659609643568=/root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204399.69728: variable 'ansible_module_compression' from source: unknown 37031 1727204399.69731: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204399.69733: 
variable 'ansible_facts' from source: unknown 37031 1727204399.69753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/AnsiballZ_command.py 37031 1727204399.70048: Sending initial data 37031 1727204399.70051: Sent initial data (156 bytes) 37031 1727204399.73182: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204399.73205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.73222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.73240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.73291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.73306: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.73327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.73349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.73373: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.73387: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.73399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.73413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.73433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.73445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.73461: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.73477: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.73562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.73591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.73608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.73691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.75520: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204399.75567: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204399.75616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp9uqtxxrz /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/AnsiballZ_command.py <<< 37031 1727204399.75661: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204399.77068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204399.77174: stderr chunk (state=3): >>><<< 37031 1727204399.77177: stdout chunk (state=3): >>><<< 37031 1727204399.77180: done transferring module to remote 37031 1727204399.77182: _low_level_execute_command(): starting 37031 1727204399.77188: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/ /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/AnsiballZ_command.py && sleep 0' 37031 1727204399.78919: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204399.78972: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.79023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.79044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.79098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.79111: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.79127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.79146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.79167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.79183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.79196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.79211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.79227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.79241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.79254: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.79278: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.79363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.79396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.79416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.79491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.81316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204399.81319: stdout chunk (state=3): >>><<< 37031 1727204399.81322: stderr chunk (state=3): >>><<< 37031 1727204399.81493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204399.81497: _low_level_execute_command(): starting 37031 1727204399.81500: _low_level_execute_command(): 
executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/AnsiballZ_command.py && sleep 0' 37031 1727204399.83039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204399.83062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.83086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.83109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.83246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.83262: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.83279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.83298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.83310: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.83321: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.83333: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.83347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.83367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.83380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.83392: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.83406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.83483: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 37031 1727204399.83603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.83694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204399.97245: stdout chunk (state=3): >>> {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:59:59.968197", "end": "2024-09-24 14:59:59.971605", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204399.98477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204399.98482: stdout chunk (state=3): >>><<< 37031 1727204399.98487: stderr chunk (state=3): >>><<< 37031 1727204399.98507: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 14:59:59.968197", "end": "2024-09-24 14:59:59.971605", "delta": "0:00:00.003408", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204399.98550: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204399.98560: _low_level_execute_command(): starting 37031 1727204399.98563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204399.6338263-38413-100659609643568/ > /dev/null 2>&1 && sleep 0' 37031 1727204399.99394: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204399.99581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.99592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.99607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.99649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.99659: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204399.99667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.99685: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204399.99692: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204399.99699: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204399.99707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204399.99716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204399.99729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204399.99736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204399.99743: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204399.99753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204399.99824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204399.99917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204399.99929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204399.99993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204400.01906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204400.01910: stdout chunk (state=3): >>><<< 37031 1727204400.01917: stderr chunk (state=3): >>><<< 37031 1727204400.01939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204400.02076: handler run complete 37031 1727204400.02079: Evaluated conditional (False): False 37031 1727204400.02081: attempt loop complete, returning result 37031 1727204400.02084: _execute() done 37031 1727204400.02085: dumping result to json 37031 1727204400.02087: done dumping result, returning 37031 1727204400.02089: done running TaskExecutor() for managed-node2/TASK: Get ipv6 routes [0affcd87-79f5-b754-dfb8-000000000061] 37031 1727204400.02090: sending task result for task 0affcd87-79f5-b754-dfb8-000000000061 37031 1727204400.02155: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000061 37031 1727204400.02162: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003408", "end": "2024-09-24 14:59:59.971605", "rc": 0, "start": "2024-09-24 14:59:59.968197" } STDOUT: ::1 dev lo proto kernel metric 256 pref medium 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 37031 1727204400.02235: no more pending 
results, returning what we have 37031 1727204400.02238: results queue empty 37031 1727204400.02239: checking for any_errors_fatal 37031 1727204400.02246: done checking for any_errors_fatal 37031 1727204400.02246: checking for max_fail_percentage 37031 1727204400.02248: done checking for max_fail_percentage 37031 1727204400.02249: checking to see if all hosts have failed and the running result is not ok 37031 1727204400.02250: done checking to see if all hosts have failed 37031 1727204400.02250: getting the remaining hosts for this loop 37031 1727204400.02252: done getting the remaining hosts for this loop 37031 1727204400.02258: getting the next task for host managed-node2 37031 1727204400.02263: done getting next task for host managed-node2 37031 1727204400.02268: ^ task is: TASK: Show ipv6_route 37031 1727204400.02269: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204400.02272: getting variables 37031 1727204400.02274: in VariableManager get_vars() 37031 1727204400.02310: Calling all_inventory to load vars for managed-node2 37031 1727204400.02312: Calling groups_inventory to load vars for managed-node2 37031 1727204400.02314: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204400.02324: Calling all_plugins_play to load vars for managed-node2 37031 1727204400.02326: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204400.02329: Calling groups_plugins_play to load vars for managed-node2 37031 1727204400.05154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204400.06638: done with get_vars() 37031 1727204400.06666: done getting variables 37031 1727204400.06743: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Tuesday 24 September 2024 15:00:00 -0400 (0:00:00.515) 0:00:22.612 ***** 37031 1727204400.06779: entering _queue_task() for managed-node2/debug 37031 1727204400.07302: worker is 1 (out of 1 available) 37031 1727204400.07351: exiting _queue_task() for managed-node2/debug 37031 1727204400.08296: done queuing things up, now waiting for results queue to drain 37031 1727204400.08300: waiting for pending results... 
37031 1727204400.09120: running TaskExecutor() for managed-node2/TASK: Show ipv6_route 37031 1727204400.09238: in run() - task 0affcd87-79f5-b754-dfb8-000000000062 37031 1727204400.09280: variable 'ansible_search_path' from source: unknown 37031 1727204400.09344: calling self._execute() 37031 1727204400.09849: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.09862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.09882: variable 'omit' from source: magic vars 37031 1727204400.10300: variable 'ansible_distribution_major_version' from source: facts 37031 1727204400.10318: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204400.10329: variable 'omit' from source: magic vars 37031 1727204400.10360: variable 'omit' from source: magic vars 37031 1727204400.10487: variable 'omit' from source: magic vars 37031 1727204400.10546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204400.10592: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204400.10633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204400.10637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.10651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.10682: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204400.10688: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.10691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.10772: Set connection var ansible_connection to ssh 37031 1727204400.10776: Set 
connection var ansible_shell_type to sh 37031 1727204400.10782: Set connection var ansible_pipelining to False 37031 1727204400.10791: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204400.10796: Set connection var ansible_timeout to 10 37031 1727204400.10801: Set connection var ansible_shell_executable to /bin/sh 37031 1727204400.10823: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.10825: variable 'ansible_connection' from source: unknown 37031 1727204400.10828: variable 'ansible_module_compression' from source: unknown 37031 1727204400.10831: variable 'ansible_shell_type' from source: unknown 37031 1727204400.10833: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.10836: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.10840: variable 'ansible_pipelining' from source: unknown 37031 1727204400.10842: variable 'ansible_timeout' from source: unknown 37031 1727204400.10847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.10959: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204400.10977: variable 'omit' from source: magic vars 37031 1727204400.10980: starting attempt loop 37031 1727204400.10983: running the handler 37031 1727204400.11090: variable 'ipv6_route' from source: set_fact 37031 1727204400.11105: handler run complete 37031 1727204400.11119: attempt loop complete, returning result 37031 1727204400.11122: _execute() done 37031 1727204400.11124: dumping result to json 37031 1727204400.11128: done dumping result, returning 37031 1727204400.11135: done running TaskExecutor() for managed-node2/TASK: Show ipv6_route 
[0affcd87-79f5-b754-dfb8-000000000062] 37031 1727204400.11139: sending task result for task 0affcd87-79f5-b754-dfb8-000000000062 37031 1727204400.11284: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000062 37031 1727204400.11288: WORKER PROCESS EXITING ok: [managed-node2] => { "ipv6_route.stdout": "::1 dev lo proto kernel metric 256 pref medium\n2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 37031 1727204400.11334: no more pending results, returning what we have 37031 1727204400.11338: results queue empty 37031 1727204400.11339: checking for any_errors_fatal 37031 1727204400.11347: done checking for any_errors_fatal 37031 1727204400.11348: checking for max_fail_percentage 37031 1727204400.11350: done checking for max_fail_percentage 37031 1727204400.11351: checking to see if all hosts have failed and the running result is not ok 37031 1727204400.11352: done checking to see if all hosts have failed 37031 1727204400.11353: getting the remaining hosts for this loop 37031 1727204400.11355: done getting the remaining hosts for this loop 37031 1727204400.11359: getting the next task for host managed-node2 37031 1727204400.11367: done getting next task for host managed-node2 37031 1727204400.11370: ^ task is: TASK: Assert default ipv6 route is set 37031 1727204400.11372: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204400.11376: getting variables 37031 1727204400.11378: in VariableManager get_vars() 37031 1727204400.11425: Calling all_inventory to load vars for managed-node2 37031 1727204400.11428: Calling groups_inventory to load vars for managed-node2 37031 1727204400.11430: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204400.11441: Calling all_plugins_play to load vars for managed-node2 37031 1727204400.11444: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204400.11447: Calling groups_plugins_play to load vars for managed-node2 37031 1727204400.13245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204400.15213: done with get_vars() 37031 1727204400.15245: done getting variables 37031 1727204400.15305: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Tuesday 24 September 2024 15:00:00 -0400 (0:00:00.085) 0:00:22.698 ***** 37031 1727204400.15343: entering _queue_task() for managed-node2/assert 37031 1727204400.15784: worker is 1 (out of 1 available) 37031 1727204400.15797: exiting _queue_task() for managed-node2/assert 37031 1727204400.15810: done queuing things up, now waiting for results queue to drain 37031 1727204400.15811: waiting for pending results... 
37031 1727204400.16124: running TaskExecutor() for managed-node2/TASK: Assert default ipv6 route is set 37031 1727204400.16244: in run() - task 0affcd87-79f5-b754-dfb8-000000000063 37031 1727204400.16271: variable 'ansible_search_path' from source: unknown 37031 1727204400.16322: calling self._execute() 37031 1727204400.16435: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.16447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.16462: variable 'omit' from source: magic vars 37031 1727204400.16894: variable 'ansible_distribution_major_version' from source: facts 37031 1727204400.16944: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204400.16960: variable 'omit' from source: magic vars 37031 1727204400.17012: variable 'omit' from source: magic vars 37031 1727204400.17108: variable 'omit' from source: magic vars 37031 1727204400.17200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204400.17306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204400.17378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204400.17451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.17497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.17578: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204400.17606: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.17620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.17828: Set connection var ansible_connection to ssh 37031 
1727204400.17841: Set connection var ansible_shell_type to sh 37031 1727204400.17853: Set connection var ansible_pipelining to False 37031 1727204400.17895: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204400.17910: Set connection var ansible_timeout to 10 37031 1727204400.17922: Set connection var ansible_shell_executable to /bin/sh 37031 1727204400.17976: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.17985: variable 'ansible_connection' from source: unknown 37031 1727204400.17999: variable 'ansible_module_compression' from source: unknown 37031 1727204400.18006: variable 'ansible_shell_type' from source: unknown 37031 1727204400.18017: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.18024: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.18031: variable 'ansible_pipelining' from source: unknown 37031 1727204400.18039: variable 'ansible_timeout' from source: unknown 37031 1727204400.18047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.18302: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204400.18341: variable 'omit' from source: magic vars 37031 1727204400.18354: starting attempt loop 37031 1727204400.18384: running the handler 37031 1727204400.18698: variable '__test_str' from source: task vars 37031 1727204400.18851: variable 'interface' from source: play vars 37031 1727204400.18876: variable 'ipv6_route' from source: set_fact 37031 1727204400.18901: Evaluated conditional (__test_str in ipv6_route.stdout): True 37031 1727204400.18931: handler run complete 37031 1727204400.18968: attempt loop complete, returning result 37031 
1727204400.18990: _execute() done 37031 1727204400.19019: dumping result to json 37031 1727204400.19031: done dumping result, returning 37031 1727204400.19071: done running TaskExecutor() for managed-node2/TASK: Assert default ipv6 route is set [0affcd87-79f5-b754-dfb8-000000000063] 37031 1727204400.19083: sending task result for task 0affcd87-79f5-b754-dfb8-000000000063 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 37031 1727204400.19365: no more pending results, returning what we have 37031 1727204400.19369: results queue empty 37031 1727204400.19370: checking for any_errors_fatal 37031 1727204400.19377: done checking for any_errors_fatal 37031 1727204400.19378: checking for max_fail_percentage 37031 1727204400.19380: done checking for max_fail_percentage 37031 1727204400.19381: checking to see if all hosts have failed and the running result is not ok 37031 1727204400.19382: done checking to see if all hosts have failed 37031 1727204400.19383: getting the remaining hosts for this loop 37031 1727204400.19384: done getting the remaining hosts for this loop 37031 1727204400.19389: getting the next task for host managed-node2 37031 1727204400.19396: done getting next task for host managed-node2 37031 1727204400.19400: ^ task is: TASK: Ensure ping6 command is present 37031 1727204400.19402: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204400.19405: getting variables 37031 1727204400.19407: in VariableManager get_vars() 37031 1727204400.19454: Calling all_inventory to load vars for managed-node2 37031 1727204400.19457: Calling groups_inventory to load vars for managed-node2 37031 1727204400.19459: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204400.19487: Calling all_plugins_play to load vars for managed-node2 37031 1727204400.19490: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204400.19494: Calling groups_plugins_play to load vars for managed-node2 37031 1727204400.20730: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000063 37031 1727204400.20733: WORKER PROCESS EXITING 37031 1727204400.22061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204400.24093: done with get_vars() 37031 1727204400.24145: done getting variables 37031 1727204400.24285: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Tuesday 24 September 2024 15:00:00 -0400 (0:00:00.089) 0:00:22.788 ***** 37031 1727204400.24322: entering _queue_task() for managed-node2/package 37031 1727204400.24840: worker is 1 (out of 1 available) 37031 1727204400.24853: exiting _queue_task() for managed-node2/package 37031 1727204400.24872: done queuing things up, now waiting for results queue to drain 37031 1727204400.24873: waiting for pending results... 
37031 1727204400.25269: running TaskExecutor() for managed-node2/TASK: Ensure ping6 command is present 37031 1727204400.25456: in run() - task 0affcd87-79f5-b754-dfb8-000000000064 37031 1727204400.25479: variable 'ansible_search_path' from source: unknown 37031 1727204400.25533: calling self._execute() 37031 1727204400.25704: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.25725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.25748: variable 'omit' from source: magic vars 37031 1727204400.26180: variable 'ansible_distribution_major_version' from source: facts 37031 1727204400.26205: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204400.26216: variable 'omit' from source: magic vars 37031 1727204400.26240: variable 'omit' from source: magic vars 37031 1727204400.26462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204400.29368: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204400.29503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204400.29550: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204400.29597: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204400.29646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204400.29830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204400.29873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204400.29916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204400.30001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204400.30031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204400.30203: variable '__network_is_ostree' from source: set_fact 37031 1727204400.30213: variable 'omit' from source: magic vars 37031 1727204400.30246: variable 'omit' from source: magic vars 37031 1727204400.30301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204400.30331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204400.30352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204400.30379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.30402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204400.30446: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204400.30455: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.30463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.30592: 
Set connection var ansible_connection to ssh 37031 1727204400.30603: Set connection var ansible_shell_type to sh 37031 1727204400.30621: Set connection var ansible_pipelining to False 37031 1727204400.30635: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204400.30646: Set connection var ansible_timeout to 10 37031 1727204400.30657: Set connection var ansible_shell_executable to /bin/sh 37031 1727204400.30693: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.30707: variable 'ansible_connection' from source: unknown 37031 1727204400.30722: variable 'ansible_module_compression' from source: unknown 37031 1727204400.30730: variable 'ansible_shell_type' from source: unknown 37031 1727204400.30738: variable 'ansible_shell_executable' from source: unknown 37031 1727204400.30745: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204400.30753: variable 'ansible_pipelining' from source: unknown 37031 1727204400.30760: variable 'ansible_timeout' from source: unknown 37031 1727204400.30770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204400.31011: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204400.31029: variable 'omit' from source: magic vars 37031 1727204400.31045: starting attempt loop 37031 1727204400.31054: running the handler 37031 1727204400.31069: variable 'ansible_facts' from source: unknown 37031 1727204400.31078: variable 'ansible_facts' from source: unknown 37031 1727204400.31135: _low_level_execute_command(): starting 37031 1727204400.31219: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204400.32375: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204400.32391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.32410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.32436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.32483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.32495: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204400.32507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.32527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204400.32538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204400.32554: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204400.32568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.32581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.32594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.32604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.32613: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204400.32627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.32708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204400.32733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204400.32758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 37031 1727204400.32835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204400.34495: stdout chunk (state=3): >>>/root <<< 37031 1727204400.34698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204400.34817: stdout chunk (state=3): >>><<< 37031 1727204400.34821: stderr chunk (state=3): >>><<< 37031 1727204400.34824: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204400.34827: _low_level_execute_command(): starting 37031 1727204400.34830: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897 `" && echo ansible-tmp-1727204400.3472216-38448-51693924670897="` echo 
/root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897 `" ) && sleep 0' 37031 1727204400.35541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204400.35557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.35576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.35598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.35639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.35649: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204400.35662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.35682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204400.35695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204400.35709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204400.35721: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.35733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.35748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.35760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.35773: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204400.35788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.35872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204400.35943: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204400.35961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204400.36040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204400.37891: stdout chunk (state=3): >>>ansible-tmp-1727204400.3472216-38448-51693924670897=/root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897 <<< 37031 1727204400.38102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204400.38106: stdout chunk (state=3): >>><<< 37031 1727204400.38108: stderr chunk (state=3): >>><<< 37031 1727204400.38372: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204400.3472216-38448-51693924670897=/root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 37031 1727204400.38375: variable 'ansible_module_compression' from source: unknown 37031 1727204400.38378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 37031 1727204400.38380: variable 'ansible_facts' from source: unknown 37031 1727204400.38384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/AnsiballZ_dnf.py 37031 1727204400.38937: Sending initial data 37031 1727204400.38941: Sent initial data (151 bytes) 37031 1727204400.41970: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204400.41991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.42006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.42027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.42073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.42088: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204400.42100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.43881: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204400.44161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204400.44178: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204400.44517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.44538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.44560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
37031 1727204400.44576: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.44593: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204400.44608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.44690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204400.44731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204400.44774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204400.45127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204400.46545: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204400.46583: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204400.46626: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpl4_8f6xu /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/AnsiballZ_dnf.py <<< 37031 1727204400.46663: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204400.48429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 
1727204400.48620: stderr chunk (state=3): >>><<< 37031 1727204400.48624: stdout chunk (state=3): >>><<< 37031 1727204400.48627: done transferring module to remote 37031 1727204400.48630: _low_level_execute_command(): starting 37031 1727204400.48632: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/ /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/AnsiballZ_dnf.py && sleep 0' 37031 1727204400.50194: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204400.50230: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.50240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.50253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.50358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.50362: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204400.50376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.50391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204400.50399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204400.50406: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204400.50415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.50422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.50436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.50452: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.50462: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204400.50479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.50550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204400.50682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204400.50690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204400.50895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204400.52867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204400.52871: stdout chunk (state=3): >>><<< 37031 1727204400.52877: stderr chunk (state=3): >>><<< 37031 1727204400.52904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204400.52907: _low_level_execute_command(): starting 37031 1727204400.52910: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/AnsiballZ_dnf.py && sleep 0' 37031 1727204400.55367: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204400.55371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.55383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.55396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.55435: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.55445: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204400.55461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.55474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204400.55559: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204400.55575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204400.55581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204400.55590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204400.55602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204400.55609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204400.55615: stderr chunk (state=3): >>>debug2: match 
found <<< 37031 1727204400.55625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204400.55790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204400.55818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204400.55822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204400.55966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.50836: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 37031 1727204401.55311: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204401.55317: stdout chunk (state=3): >>><<< 37031 1727204401.55320: stderr chunk (state=3): >>><<< 37031 1727204401.55340: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204401.55391: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204401.55397: _low_level_execute_command(): starting 37031 1727204401.55402: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204400.3472216-38448-51693924670897/ > /dev/null 2>&1 && sleep 0' 37031 1727204401.56039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.56048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.56061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.56075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.56114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.56121: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.56131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.56144: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.56151: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.56160: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.56167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.56176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.56187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.56194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.56201: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204401.56208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.56287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.56304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.56315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.56392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.58298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204401.58302: stderr chunk (state=3): >>><<< 37031 1727204401.58305: stdout chunk (state=3): >>><<< 37031 1727204401.58325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204401.58332: handler run complete 37031 1727204401.58377: attempt loop complete, returning result 37031 1727204401.58380: _execute() done 37031 1727204401.58382: dumping result to json 37031 1727204401.58388: done dumping result, returning 37031 1727204401.58396: done running TaskExecutor() for managed-node2/TASK: Ensure ping6 command is present [0affcd87-79f5-b754-dfb8-000000000064] 37031 1727204401.58401: sending task result for task 0affcd87-79f5-b754-dfb8-000000000064 37031 1727204401.58502: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000064 37031 1727204401.58505: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 37031 1727204401.58573: no more pending results, returning what we have 37031 1727204401.58577: results queue empty 37031 1727204401.58578: checking for any_errors_fatal 37031 1727204401.58584: done checking for any_errors_fatal 37031 1727204401.58585: checking for max_fail_percentage 37031 1727204401.58587: done checking for max_fail_percentage 37031 1727204401.58588: checking to see if all hosts have failed and the running result is not ok 37031 1727204401.58589: done 
checking to see if all hosts have failed 37031 1727204401.58590: getting the remaining hosts for this loop 37031 1727204401.58591: done getting the remaining hosts for this loop 37031 1727204401.58595: getting the next task for host managed-node2 37031 1727204401.58601: done getting next task for host managed-node2 37031 1727204401.58603: ^ task is: TASK: Test gateway can be pinged 37031 1727204401.58605: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204401.58608: getting variables 37031 1727204401.58610: in VariableManager get_vars() 37031 1727204401.58650: Calling all_inventory to load vars for managed-node2 37031 1727204401.58652: Calling groups_inventory to load vars for managed-node2 37031 1727204401.58654: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204401.58668: Calling all_plugins_play to load vars for managed-node2 37031 1727204401.58670: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204401.58673: Calling groups_plugins_play to load vars for managed-node2 37031 1727204401.60113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204401.61849: done with get_vars() 37031 1727204401.61886: done getting variables 37031 1727204401.61950: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Tuesday 24 September 2024 15:00:01 -0400 (0:00:01.378) 0:00:24.166 ***** 37031 1727204401.62134: entering _queue_task() for managed-node2/command 37031 1727204401.62490: worker is 1 (out of 1 available) 37031 1727204401.62502: exiting _queue_task() for managed-node2/command 37031 1727204401.62517: done queuing things up, now waiting for results queue to drain 37031 1727204401.62518: waiting for pending results... 37031 1727204401.62818: running TaskExecutor() for managed-node2/TASK: Test gateway can be pinged 37031 1727204401.62932: in run() - task 0affcd87-79f5-b754-dfb8-000000000065 37031 1727204401.62951: variable 'ansible_search_path' from source: unknown 37031 1727204401.62998: calling self._execute() 37031 1727204401.63141: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204401.63153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204401.63172: variable 'omit' from source: magic vars 37031 1727204401.63788: variable 'ansible_distribution_major_version' from source: facts 37031 1727204401.63807: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204401.63820: variable 'omit' from source: magic vars 37031 1727204401.63851: variable 'omit' from source: magic vars 37031 1727204401.63897: variable 'omit' from source: magic vars 37031 1727204401.63951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204401.63997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204401.64024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204401.64053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 
1727204401.64074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204401.64110: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204401.64119: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204401.64128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204401.64237: Set connection var ansible_connection to ssh 37031 1727204401.64245: Set connection var ansible_shell_type to sh 37031 1727204401.64262: Set connection var ansible_pipelining to False 37031 1727204401.64283: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204401.64295: Set connection var ansible_timeout to 10 37031 1727204401.64305: Set connection var ansible_shell_executable to /bin/sh 37031 1727204401.64337: variable 'ansible_shell_executable' from source: unknown 37031 1727204401.64347: variable 'ansible_connection' from source: unknown 37031 1727204401.64354: variable 'ansible_module_compression' from source: unknown 37031 1727204401.64368: variable 'ansible_shell_type' from source: unknown 37031 1727204401.64378: variable 'ansible_shell_executable' from source: unknown 37031 1727204401.64388: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204401.64396: variable 'ansible_pipelining' from source: unknown 37031 1727204401.64403: variable 'ansible_timeout' from source: unknown 37031 1727204401.64411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204401.64568: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204401.64588: variable 'omit' from source: 
magic vars 37031 1727204401.64604: starting attempt loop 37031 1727204401.64611: running the handler 37031 1727204401.64630: _low_level_execute_command(): starting 37031 1727204401.64643: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204401.65485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.65503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.65518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.65537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.65587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.65601: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.65616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.65635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.65648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.65665: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.65679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.65694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.65715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.65730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.65743: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204401.65759: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.65950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.65980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.65997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.66075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.67733: stdout chunk (state=3): >>>/root <<< 37031 1727204401.67935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204401.67938: stdout chunk (state=3): >>><<< 37031 1727204401.67941: stderr chunk (state=3): >>><<< 37031 1727204401.68075: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204401.68078: _low_level_execute_command(): 
starting 37031 1727204401.68081: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004 `" && echo ansible-tmp-1727204401.67969-38511-34147750454004="` echo /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004 `" ) && sleep 0' 37031 1727204401.68731: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.68747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.68769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.68788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.68833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.68852: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.68871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.68888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.68899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.68910: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.68923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.68936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.68962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.68978: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.68990: stderr chunk 
(state=3): >>>debug2: match found <<< 37031 1727204401.69004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.69090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.69115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.69133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.69207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.71111: stdout chunk (state=3): >>>ansible-tmp-1727204401.67969-38511-34147750454004=/root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004 <<< 37031 1727204401.71287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204401.71337: stderr chunk (state=3): >>><<< 37031 1727204401.71340: stdout chunk (state=3): >>><<< 37031 1727204401.71569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204401.67969-38511-34147750454004=/root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204401.71572: variable 'ansible_module_compression' from source: unknown 37031 1727204401.71574: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204401.71576: variable 'ansible_facts' from source: unknown 37031 1727204401.71587: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/AnsiballZ_command.py 37031 1727204401.71762: Sending initial data 37031 1727204401.71767: Sent initial data (153 bytes) 37031 1727204401.73723: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.73749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.73772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.73792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.73836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.73863: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.73884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.73904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.73919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.73930: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.73942: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 37031 1727204401.73955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.73987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.73999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.74010: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204401.74022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.74114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.74137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.74154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.74233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.75977: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204401.76058: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204401.76063: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpalh_o_x5 
/root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/AnsiballZ_command.py <<< 37031 1727204401.76090: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204401.77170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204401.77374: stderr chunk (state=3): >>><<< 37031 1727204401.77378: stdout chunk (state=3): >>><<< 37031 1727204401.77476: done transferring module to remote 37031 1727204401.77479: _low_level_execute_command(): starting 37031 1727204401.77486: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/ /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/AnsiballZ_command.py && sleep 0' 37031 1727204401.78176: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.78191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.78206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.78224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.78279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.78292: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.78306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.78323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.78338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.78358: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.78374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 37031 1727204401.78387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.78404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.78416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.78428: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204401.78443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.78521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.78540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.78554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.78632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.80504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204401.80508: stdout chunk (state=3): >>><<< 37031 1727204401.80511: stderr chunk (state=3): >>><<< 37031 1727204401.80571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204401.80575: _low_level_execute_command(): starting 37031 1727204401.80578: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/AnsiballZ_command.py && sleep 0' 37031 1727204401.82818: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204401.82826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.82838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.82851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.82896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.82902: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204401.82912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.82926: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204401.82933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204401.82940: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204401.82949: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204401.82959: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 37031 1727204401.82973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.82980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.82987: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204401.82996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.83066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.83088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204401.83097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.83411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204401.97041: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:00:01.965465", "end": "2024-09-24 15:00:01.969532", "delta": "0:00:00.004067", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204401.98279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204401.98283: stdout chunk (state=3): >>><<< 37031 1727204401.98289: stderr chunk (state=3): >>><<< 37031 1727204401.98338: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1(2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-24 15:00:01.965465", "end": "2024-09-24 15:00:01.969532", "delta": "0:00:00.004067", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204401.98378: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204401.98387: _low_level_execute_command(): starting 37031 1727204401.98392: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204401.67969-38511-34147750454004/ > /dev/null 2>&1 && sleep 0' 37031 1727204401.99660: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204401.99665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204401.99704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 37031 1727204401.99708: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204401.99711: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204401.99782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204401.99785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204401.99840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204402.01648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204402.01726: stderr chunk (state=3): >>><<< 37031 1727204402.01729: stdout chunk (state=3): >>><<< 37031 1727204402.01780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204402.01783: handler run complete 37031 1727204402.01870: Evaluated conditional (False): False 37031 1727204402.01873: attempt loop complete, returning result 37031 1727204402.01875: _execute() done 37031 1727204402.01879: dumping result to json 37031 1727204402.01882: done dumping result, returning 37031 1727204402.01884: done running TaskExecutor() for managed-node2/TASK: Test gateway can be pinged [0affcd87-79f5-b754-dfb8-000000000065] 37031 1727204402.01886: sending task result for task 0affcd87-79f5-b754-dfb8-000000000065 37031 1727204402.02052: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000065 37031 1727204402.02055: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.004067", "end": "2024-09-24 15:00:01.969532", "rc": 0, "start": "2024-09-24 15:00:01.965465" } STDOUT: PING 2001:db8::1(2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.041 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 37031 1727204402.02130: no more pending results, returning what we have 37031 1727204402.02133: results queue empty 37031 1727204402.02134: checking for any_errors_fatal 37031 1727204402.02142: done checking for any_errors_fatal 37031 1727204402.02143: checking for max_fail_percentage 37031 1727204402.02144: done checking for max_fail_percentage 37031 1727204402.02145: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.02146: done checking to see if all hosts have failed 37031 1727204402.02147: getting the remaining hosts for this loop 37031 1727204402.02149: done getting the remaining hosts for this loop 37031 1727204402.02153: getting the next task for host 
managed-node2 37031 1727204402.02165: done getting next task for host managed-node2 37031 1727204402.02168: ^ task is: TASK: TEARDOWN: remove profiles. 37031 1727204402.02171: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204402.02174: getting variables 37031 1727204402.02177: in VariableManager get_vars() 37031 1727204402.02223: Calling all_inventory to load vars for managed-node2 37031 1727204402.02226: Calling groups_inventory to load vars for managed-node2 37031 1727204402.02229: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.02241: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.02244: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.02246: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.04381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.06206: done with get_vars() 37031 1727204402.06241: done getting variables 37031 1727204402.06304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] 
********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.442) 0:00:24.608 ***** 37031 1727204402.06340: entering _queue_task() for managed-node2/debug 37031 1727204402.06697: worker is 1 (out of 1 available) 37031 1727204402.06713: exiting _queue_task() for managed-node2/debug 37031 1727204402.06726: done queuing things up, now waiting for results queue to drain 37031 1727204402.06727: waiting for pending results... 37031 1727204402.07041: running TaskExecutor() for managed-node2/TASK: TEARDOWN: remove profiles. 37031 1727204402.07491: in run() - task 0affcd87-79f5-b754-dfb8-000000000066 37031 1727204402.07511: variable 'ansible_search_path' from source: unknown 37031 1727204402.07553: calling self._execute() 37031 1727204402.07662: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.07682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.07698: variable 'omit' from source: magic vars 37031 1727204402.08556: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.08580: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.08591: variable 'omit' from source: magic vars 37031 1727204402.08619: variable 'omit' from source: magic vars 37031 1727204402.08697: variable 'omit' from source: magic vars 37031 1727204402.08812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204402.08902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204402.09004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204402.09068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 37031 1727204402.09198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204402.09348: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204402.09357: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.09369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.09573: Set connection var ansible_connection to ssh 37031 1727204402.09588: Set connection var ansible_shell_type to sh 37031 1727204402.09603: Set connection var ansible_pipelining to False 37031 1727204402.09642: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204402.09653: Set connection var ansible_timeout to 10 37031 1727204402.09665: Set connection var ansible_shell_executable to /bin/sh 37031 1727204402.09700: variable 'ansible_shell_executable' from source: unknown 37031 1727204402.09708: variable 'ansible_connection' from source: unknown 37031 1727204402.09715: variable 'ansible_module_compression' from source: unknown 37031 1727204402.09727: variable 'ansible_shell_type' from source: unknown 37031 1727204402.09734: variable 'ansible_shell_executable' from source: unknown 37031 1727204402.09743: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.09753: variable 'ansible_pipelining' from source: unknown 37031 1727204402.09760: variable 'ansible_timeout' from source: unknown 37031 1727204402.09770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.09924: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 
1727204402.09943: variable 'omit' from source: magic vars 37031 1727204402.09953: starting attempt loop 37031 1727204402.09962: running the handler 37031 1727204402.10016: handler run complete 37031 1727204402.10041: attempt loop complete, returning result 37031 1727204402.10049: _execute() done 37031 1727204402.10056: dumping result to json 37031 1727204402.10066: done dumping result, returning 37031 1727204402.10081: done running TaskExecutor() for managed-node2/TASK: TEARDOWN: remove profiles. [0affcd87-79f5-b754-dfb8-000000000066] 37031 1727204402.10090: sending task result for task 0affcd87-79f5-b754-dfb8-000000000066 ok: [managed-node2] => {} MSG: ################################################## 37031 1727204402.10245: no more pending results, returning what we have 37031 1727204402.10248: results queue empty 37031 1727204402.10250: checking for any_errors_fatal 37031 1727204402.10260: done checking for any_errors_fatal 37031 1727204402.10261: checking for max_fail_percentage 37031 1727204402.10265: done checking for max_fail_percentage 37031 1727204402.10266: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.10267: done checking to see if all hosts have failed 37031 1727204402.10268: getting the remaining hosts for this loop 37031 1727204402.10270: done getting the remaining hosts for this loop 37031 1727204402.10275: getting the next task for host managed-node2 37031 1727204402.10284: done getting next task for host managed-node2 37031 1727204402.10292: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 37031 1727204402.10296: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204402.10315: getting variables 37031 1727204402.10317: in VariableManager get_vars() 37031 1727204402.10361: Calling all_inventory to load vars for managed-node2 37031 1727204402.10366: Calling groups_inventory to load vars for managed-node2 37031 1727204402.10369: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.10379: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.10382: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.10385: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.11605: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000066 37031 1727204402.11609: WORKER PROCESS EXITING 37031 1727204402.12944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.15195: done with get_vars() 37031 1727204402.15221: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.089) 0:00:24.698 ***** 37031 1727204402.15332: entering _queue_task() for managed-node2/include_tasks 37031 1727204402.15682: worker is 1 (out of 1 available) 37031 1727204402.15696: exiting _queue_task() for managed-node2/include_tasks 37031 1727204402.15711: done queuing things up, now waiting for results queue to drain 37031 1727204402.15712: waiting for pending results... 
37031 1727204402.15939: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 37031 1727204402.16054: in run() - task 0affcd87-79f5-b754-dfb8-00000000006e 37031 1727204402.16071: variable 'ansible_search_path' from source: unknown 37031 1727204402.16075: variable 'ansible_search_path' from source: unknown 37031 1727204402.16105: calling self._execute() 37031 1727204402.16189: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.16193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.16202: variable 'omit' from source: magic vars 37031 1727204402.16494: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.16504: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.16510: _execute() done 37031 1727204402.16513: dumping result to json 37031 1727204402.16517: done dumping result, returning 37031 1727204402.16523: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-b754-dfb8-00000000006e] 37031 1727204402.16528: sending task result for task 0affcd87-79f5-b754-dfb8-00000000006e 37031 1727204402.16622: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000006e 37031 1727204402.16624: WORKER PROCESS EXITING 37031 1727204402.16675: no more pending results, returning what we have 37031 1727204402.16681: in VariableManager get_vars() 37031 1727204402.16727: Calling all_inventory to load vars for managed-node2 37031 1727204402.16730: Calling groups_inventory to load vars for managed-node2 37031 1727204402.16733: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.16745: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.16753: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.16756: Calling 
groups_plugins_play to load vars for managed-node2 37031 1727204402.17594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.18971: done with get_vars() 37031 1727204402.18999: variable 'ansible_search_path' from source: unknown 37031 1727204402.19001: variable 'ansible_search_path' from source: unknown 37031 1727204402.19045: we have included files to process 37031 1727204402.19046: generating all_blocks data 37031 1727204402.19048: done generating all_blocks data 37031 1727204402.19054: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204402.19055: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204402.19058: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 37031 1727204402.19602: done processing included file 37031 1727204402.19604: iterating over new_blocks loaded from include file 37031 1727204402.19605: in VariableManager get_vars() 37031 1727204402.19624: done with get_vars() 37031 1727204402.19625: filtering new block on tags 37031 1727204402.19640: done filtering new block on tags 37031 1727204402.19642: in VariableManager get_vars() 37031 1727204402.19656: done with get_vars() 37031 1727204402.19657: filtering new block on tags 37031 1727204402.19672: done filtering new block on tags 37031 1727204402.19674: in VariableManager get_vars() 37031 1727204402.19689: done with get_vars() 37031 1727204402.19690: filtering new block on tags 37031 1727204402.19704: done filtering new block on tags 37031 1727204402.19705: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 37031 1727204402.19710: extending task lists for 
all hosts with included blocks 37031 1727204402.20218: done extending task lists 37031 1727204402.20219: done processing included files 37031 1727204402.20219: results queue empty 37031 1727204402.20220: checking for any_errors_fatal 37031 1727204402.20222: done checking for any_errors_fatal 37031 1727204402.20223: checking for max_fail_percentage 37031 1727204402.20224: done checking for max_fail_percentage 37031 1727204402.20224: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.20225: done checking to see if all hosts have failed 37031 1727204402.20225: getting the remaining hosts for this loop 37031 1727204402.20226: done getting the remaining hosts for this loop 37031 1727204402.20228: getting the next task for host managed-node2 37031 1727204402.20231: done getting next task for host managed-node2 37031 1727204402.20233: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 37031 1727204402.20235: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204402.20244: getting variables 37031 1727204402.20245: in VariableManager get_vars() 37031 1727204402.20259: Calling all_inventory to load vars for managed-node2 37031 1727204402.20260: Calling groups_inventory to load vars for managed-node2 37031 1727204402.20262: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.20267: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.20268: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.20270: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.21048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.21970: done with get_vars() 37031 1727204402.21995: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.067) 0:00:24.765 ***** 37031 1727204402.22077: entering _queue_task() for managed-node2/setup 37031 1727204402.22406: worker is 1 (out of 1 available) 37031 1727204402.22420: exiting _queue_task() for managed-node2/setup 37031 1727204402.22434: done queuing things up, now waiting for results queue to drain 37031 1727204402.22436: waiting for pending results... 
37031 1727204402.22753: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 37031 1727204402.22939: in run() - task 0affcd87-79f5-b754-dfb8-000000000513 37031 1727204402.22960: variable 'ansible_search_path' from source: unknown 37031 1727204402.22972: variable 'ansible_search_path' from source: unknown 37031 1727204402.23022: calling self._execute() 37031 1727204402.23129: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.23142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.23156: variable 'omit' from source: magic vars 37031 1727204402.23545: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.23566: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.23804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204402.26186: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204402.26282: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204402.26327: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204402.26373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204402.26406: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204402.26496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204402.26531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204402.26562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204402.26613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204402.26632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204402.26696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204402.26725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204402.26754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204402.26805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204402.26825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204402.27003: variable '__network_required_facts' from source: role 
'' defaults 37031 1727204402.27021: variable 'ansible_facts' from source: unknown 37031 1727204402.27814: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 37031 1727204402.27823: when evaluation is False, skipping this task 37031 1727204402.27830: _execute() done 37031 1727204402.27836: dumping result to json 37031 1727204402.27842: done dumping result, returning 37031 1727204402.27855: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-b754-dfb8-000000000513] 37031 1727204402.27865: sending task result for task 0affcd87-79f5-b754-dfb8-000000000513 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204402.28020: no more pending results, returning what we have 37031 1727204402.28025: results queue empty 37031 1727204402.28026: checking for any_errors_fatal 37031 1727204402.28028: done checking for any_errors_fatal 37031 1727204402.28029: checking for max_fail_percentage 37031 1727204402.28030: done checking for max_fail_percentage 37031 1727204402.28031: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.28033: done checking to see if all hosts have failed 37031 1727204402.28033: getting the remaining hosts for this loop 37031 1727204402.28036: done getting the remaining hosts for this loop 37031 1727204402.28041: getting the next task for host managed-node2 37031 1727204402.28051: done getting next task for host managed-node2 37031 1727204402.28056: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 37031 1727204402.28060: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204402.28079: getting variables 37031 1727204402.28082: in VariableManager get_vars() 37031 1727204402.28128: Calling all_inventory to load vars for managed-node2 37031 1727204402.28132: Calling groups_inventory to load vars for managed-node2 37031 1727204402.28134: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.28145: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.28148: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.28151: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.29192: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000513 37031 1727204402.29196: WORKER PROCESS EXITING 37031 1727204402.29897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.31583: done with get_vars() 37031 1727204402.31618: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.096) 0:00:24.862 ***** 37031 1727204402.31735: entering _queue_task() for managed-node2/stat 37031 1727204402.32085: worker is 
1 (out of 1 available) 37031 1727204402.32099: exiting _queue_task() for managed-node2/stat 37031 1727204402.32113: done queuing things up, now waiting for results queue to drain 37031 1727204402.32114: waiting for pending results... 37031 1727204402.32425: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 37031 1727204402.32606: in run() - task 0affcd87-79f5-b754-dfb8-000000000515 37031 1727204402.32627: variable 'ansible_search_path' from source: unknown 37031 1727204402.32635: variable 'ansible_search_path' from source: unknown 37031 1727204402.32682: calling self._execute() 37031 1727204402.32780: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.32791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.32802: variable 'omit' from source: magic vars 37031 1727204402.33157: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.33176: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.33338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204402.33605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204402.33657: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204402.33696: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204402.33734: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204402.33825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204402.33860: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204402.33895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204402.33924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204402.34020: variable '__network_is_ostree' from source: set_fact 37031 1727204402.34032: Evaluated conditional (not __network_is_ostree is defined): False 37031 1727204402.34041: when evaluation is False, skipping this task 37031 1727204402.34048: _execute() done 37031 1727204402.34055: dumping result to json 37031 1727204402.34063: done dumping result, returning 37031 1727204402.34078: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-b754-dfb8-000000000515] 37031 1727204402.34087: sending task result for task 0affcd87-79f5-b754-dfb8-000000000515 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 37031 1727204402.34236: no more pending results, returning what we have 37031 1727204402.34241: results queue empty 37031 1727204402.34242: checking for any_errors_fatal 37031 1727204402.34248: done checking for any_errors_fatal 37031 1727204402.34249: checking for max_fail_percentage 37031 1727204402.34251: done checking for max_fail_percentage 37031 1727204402.34252: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.34253: done checking to see if all hosts have failed 37031 1727204402.34254: getting the remaining hosts for this loop 37031 
1727204402.34256: done getting the remaining hosts for this loop 37031 1727204402.34260: getting the next task for host managed-node2 37031 1727204402.34270: done getting next task for host managed-node2 37031 1727204402.34274: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 37031 1727204402.34278: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204402.34296: getting variables 37031 1727204402.34298: in VariableManager get_vars() 37031 1727204402.34341: Calling all_inventory to load vars for managed-node2 37031 1727204402.34344: Calling groups_inventory to load vars for managed-node2 37031 1727204402.34346: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.34355: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.34357: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.34360: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.35388: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000515 37031 1727204402.35392: WORKER PROCESS EXITING 37031 1727204402.36200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.37827: done with get_vars() 37031 1727204402.37859: done getting variables 37031 1727204402.37922: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.062) 0:00:24.924 ***** 37031 1727204402.37961: entering _queue_task() for managed-node2/set_fact 37031 1727204402.38300: worker is 1 (out of 1 available) 37031 1727204402.38312: exiting _queue_task() for managed-node2/set_fact 37031 1727204402.38325: done queuing things up, now waiting for results queue to drain 37031 1727204402.38327: waiting for pending results... 
37031 1727204402.38660: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 37031 1727204402.38862: in run() - task 0affcd87-79f5-b754-dfb8-000000000516 37031 1727204402.38888: variable 'ansible_search_path' from source: unknown 37031 1727204402.38900: variable 'ansible_search_path' from source: unknown 37031 1727204402.38941: calling self._execute() 37031 1727204402.39049: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.39061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.39078: variable 'omit' from source: magic vars 37031 1727204402.39461: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.39483: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.39662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204402.39936: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204402.40010: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204402.40058: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204402.40148: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204402.40281: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204402.40333: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204402.40362: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204402.40405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204402.40479: variable '__network_is_ostree' from source: set_fact 37031 1727204402.40485: Evaluated conditional (not __network_is_ostree is defined): False 37031 1727204402.40491: when evaluation is False, skipping this task 37031 1727204402.40499: _execute() done 37031 1727204402.40502: dumping result to json 37031 1727204402.40505: done dumping result, returning 37031 1727204402.40513: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-b754-dfb8-000000000516] 37031 1727204402.40517: sending task result for task 0affcd87-79f5-b754-dfb8-000000000516 37031 1727204402.40607: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000516 37031 1727204402.40610: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 37031 1727204402.40680: no more pending results, returning what we have 37031 1727204402.40684: results queue empty 37031 1727204402.40685: checking for any_errors_fatal 37031 1727204402.40692: done checking for any_errors_fatal 37031 1727204402.40692: checking for max_fail_percentage 37031 1727204402.40695: done checking for max_fail_percentage 37031 1727204402.40696: checking to see if all hosts have failed and the running result is not ok 37031 1727204402.40697: done checking to see if all hosts have failed 37031 1727204402.40697: getting the remaining hosts for this loop 37031 1727204402.40700: done getting the remaining hosts for this loop 
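The skip above hinges on the Jinja `defined` test: `not __network_is_ostree is defined` is True only when the variable is absent from the host's merged vars, and an earlier `set_fact` already set `__network_is_ostree`, so the conditional is False and the task is skipped with the per-host record shown. A loose sketch of that decision and the shape of the emitted skip record (illustrative only, not Ansible's actual implementation):

```python
def evaluate_not_defined(varname, hostvars):
    # 'not <var> is defined' holds only when the variable is missing from
    # the host's merged vars; here set_fact already created it, so False.
    return varname not in hostvars

def skip_result(false_condition):
    """Shape of the per-host skip record seen in the log above when a
    'when' conditional evaluates to False (sketch of the observed output)."""
    return {
        "changed": False,
        "false_condition": false_condition,
        "skip_reason": "Conditional result was False",
    }
```

With `{"__network_is_ostree": False}` as host vars, `evaluate_not_defined` returns False, matching the `Evaluated conditional (...): False` line above.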
37031 1727204402.40704: getting the next task for host managed-node2 37031 1727204402.40714: done getting next task for host managed-node2 37031 1727204402.40721: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 37031 1727204402.40725: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204402.40742: getting variables 37031 1727204402.40744: in VariableManager get_vars() 37031 1727204402.40795: Calling all_inventory to load vars for managed-node2 37031 1727204402.40798: Calling groups_inventory to load vars for managed-node2 37031 1727204402.40799: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204402.40808: Calling all_plugins_play to load vars for managed-node2 37031 1727204402.40810: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204402.40812: Calling groups_plugins_play to load vars for managed-node2 37031 1727204402.41646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204402.42700: done with get_vars() 37031 1727204402.42718: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:00:02 -0400 (0:00:00.048) 0:00:24.972 ***** 37031 1727204402.42799: entering _queue_task() for managed-node2/service_facts 37031 1727204402.43207: worker is 1 (out of 1 available) 37031 1727204402.43219: exiting _queue_task() for managed-node2/service_facts 37031 1727204402.43233: done queuing things up, now waiting for results queue to drain 37031 1727204402.43235: waiting for pending results... 
37031 1727204402.43579: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 37031 1727204402.43782: in run() - task 0affcd87-79f5-b754-dfb8-000000000518 37031 1727204402.43810: variable 'ansible_search_path' from source: unknown 37031 1727204402.43824: variable 'ansible_search_path' from source: unknown 37031 1727204402.43878: calling self._execute() 37031 1727204402.43980: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.43983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.43992: variable 'omit' from source: magic vars 37031 1727204402.44409: variable 'ansible_distribution_major_version' from source: facts 37031 1727204402.44428: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204402.44442: variable 'omit' from source: magic vars 37031 1727204402.44534: variable 'omit' from source: magic vars 37031 1727204402.44582: variable 'omit' from source: magic vars 37031 1727204402.44634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204402.44678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204402.44704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204402.44733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204402.44750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204402.44788: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204402.44796: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.44802: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 37031 1727204402.44908: Set connection var ansible_connection to ssh 37031 1727204402.44916: Set connection var ansible_shell_type to sh 37031 1727204402.44927: Set connection var ansible_pipelining to False 37031 1727204402.44946: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204402.44955: Set connection var ansible_timeout to 10 37031 1727204402.44971: Set connection var ansible_shell_executable to /bin/sh 37031 1727204402.45004: variable 'ansible_shell_executable' from source: unknown 37031 1727204402.45012: variable 'ansible_connection' from source: unknown 37031 1727204402.45019: variable 'ansible_module_compression' from source: unknown 37031 1727204402.45033: variable 'ansible_shell_type' from source: unknown 37031 1727204402.45051: variable 'ansible_shell_executable' from source: unknown 37031 1727204402.45069: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204402.45085: variable 'ansible_pipelining' from source: unknown 37031 1727204402.45093: variable 'ansible_timeout' from source: unknown 37031 1727204402.45101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204402.45425: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204402.45435: variable 'omit' from source: magic vars 37031 1727204402.45439: starting attempt loop 37031 1727204402.45442: running the handler 37031 1727204402.45454: _low_level_execute_command(): starting 37031 1727204402.45461: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204402.46004: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 37031 1727204402.46031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204402.46050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.46099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204402.46112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204402.46174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204402.47855: stdout chunk (state=3): >>>/root <<< 37031 1727204402.47973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204402.48084: stderr chunk (state=3): >>><<< 37031 1727204402.48102: stdout chunk (state=3): >>><<< 37031 1727204402.48144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204402.48170: _low_level_execute_command(): starting 37031 1727204402.48183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635 `" && echo ansible-tmp-1727204402.4815187-38548-261958474790635="` echo /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635 `" ) && sleep 0' 37031 1727204402.48855: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204402.48886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204402.48907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204402.48926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204402.49017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204402.49047: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204402.49063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.49095: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 37031 1727204402.49108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204402.49155: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204402.49161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.49262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204402.49296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204402.51196: stdout chunk (state=3): >>>ansible-tmp-1727204402.4815187-38548-261958474790635=/root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635 <<< 37031 1727204402.51316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204402.51384: stderr chunk (state=3): >>><<< 37031 1727204402.51387: stdout chunk (state=3): >>><<< 37031 1727204402.51402: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204402.4815187-38548-261958474790635=/root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204402.51444: variable 'ansible_module_compression' from source: unknown 37031 1727204402.51485: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 37031 1727204402.51517: variable 'ansible_facts' from source: unknown 37031 1727204402.51580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/AnsiballZ_service_facts.py 37031 1727204402.51692: Sending initial data 37031 1727204402.51700: Sent initial data (162 bytes) 37031 1727204402.52432: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204402.52436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204402.52472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration <<< 37031 1727204402.52476: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.52520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204402.52532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204402.52592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204402.54326: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204402.54355: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204402.54398: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpf1925wsq /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/AnsiballZ_service_facts.py <<< 37031 1727204402.54432: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204402.55253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 
1727204402.55375: stderr chunk (state=3): >>><<< 37031 1727204402.55379: stdout chunk (state=3): >>><<< 37031 1727204402.55391: done transferring module to remote 37031 1727204402.55401: _low_level_execute_command(): starting 37031 1727204402.55406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/ /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/AnsiballZ_service_facts.py && sleep 0' 37031 1727204402.55877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204402.55881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204402.55892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204402.55931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204402.55944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.55995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204402.56012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204402.56060: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 37031 1727204402.57769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204402.57837: stderr chunk (state=3): >>><<< 37031 1727204402.57841: stdout chunk (state=3): >>><<< 37031 1727204402.57858: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204402.57862: _low_level_execute_command(): starting 37031 1727204402.57871: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/AnsiballZ_service_facts.py && sleep 0' 37031 1727204402.58345: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204402.58349: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204402.58392: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.58407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204402.58456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204402.58471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204402.58528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204403.87612: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 37031 1727204403.87646: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": 
"systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, 
"user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 37031 1727204403.88895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204403.88976: stderr chunk (state=3): >>><<< 37031 1727204403.88980: stdout chunk (state=3): >>><<< 37031 1727204403.89185: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": 
"running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": 
{"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204403.89719: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204403.89733: _low_level_execute_command(): starting 37031 1727204403.89741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204402.4815187-38548-261958474790635/ > /dev/null 2>&1 && sleep 0' 37031 1727204403.90524: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204403.90528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204403.90562: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204403.90567: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204403.90570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204403.90636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204403.90651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204403.90746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204403.92519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204403.92605: stderr chunk (state=3): >>><<< 37031 1727204403.92608: stdout chunk (state=3): >>><<< 37031 1727204403.92664: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204403.92669: handler run complete 
37031 1727204403.92782: variable 'ansible_facts' from source: unknown 37031 1727204403.93176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204403.93447: variable 'ansible_facts' from source: unknown 37031 1727204403.93904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204403.93907: attempt loop complete, returning result 37031 1727204403.93910: _execute() done 37031 1727204403.93912: dumping result to json 37031 1727204403.93914: done dumping result, returning 37031 1727204403.93917: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-b754-dfb8-000000000518] 37031 1727204403.93919: sending task result for task 0affcd87-79f5-b754-dfb8-000000000518 37031 1727204403.94463: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000518 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204403.94645: no more pending results, returning what we have 37031 1727204403.94648: results queue empty 37031 1727204403.94650: checking for any_errors_fatal 37031 1727204403.94654: done checking for any_errors_fatal 37031 1727204403.94655: checking for max_fail_percentage 37031 1727204403.94657: done checking for max_fail_percentage 37031 1727204403.94658: checking to see if all hosts have failed and the running result is not ok 37031 1727204403.94659: done checking to see if all hosts have failed 37031 1727204403.94659: getting the remaining hosts for this loop 37031 1727204403.94661: done getting the remaining hosts for this loop 37031 1727204403.94667: getting the next task for host managed-node2 37031 1727204403.94673: done getting next task for host managed-node2 37031 1727204403.94677: ^ task is: TASK: fedora.linux_system_roles.network : 
Check which packages are installed 37031 1727204403.94682: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204403.94693: getting variables 37031 1727204403.94694: in VariableManager get_vars() 37031 1727204403.94727: Calling all_inventory to load vars for managed-node2 37031 1727204403.94729: Calling groups_inventory to load vars for managed-node2 37031 1727204403.94731: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204403.94740: Calling all_plugins_play to load vars for managed-node2 37031 1727204403.94742: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204403.94744: Calling groups_plugins_play to load vars for managed-node2 37031 1727204403.95329: WORKER PROCESS EXITING 37031 1727204403.96002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204403.97014: done with get_vars() 37031 1727204403.97033: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 
2024 15:00:03 -0400 (0:00:01.543) 0:00:26.515 ***** 37031 1727204403.97113: entering _queue_task() for managed-node2/package_facts 37031 1727204403.97609: worker is 1 (out of 1 available) 37031 1727204403.97619: exiting _queue_task() for managed-node2/package_facts 37031 1727204403.97631: done queuing things up, now waiting for results queue to drain 37031 1727204403.97632: waiting for pending results... 37031 1727204403.97679: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 37031 1727204403.97861: in run() - task 0affcd87-79f5-b754-dfb8-000000000519 37031 1727204403.97885: variable 'ansible_search_path' from source: unknown 37031 1727204403.97893: variable 'ansible_search_path' from source: unknown 37031 1727204403.97934: calling self._execute() 37031 1727204403.98035: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204403.98054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204403.98071: variable 'omit' from source: magic vars 37031 1727204403.98584: variable 'ansible_distribution_major_version' from source: facts 37031 1727204403.98609: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204403.98642: variable 'omit' from source: magic vars 37031 1727204403.98699: variable 'omit' from source: magic vars 37031 1727204403.98726: variable 'omit' from source: magic vars 37031 1727204403.98760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204403.98792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204403.98810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204403.98825: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 
1727204403.98835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204403.98862: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204403.98868: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204403.98872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204403.98938: Set connection var ansible_connection to ssh 37031 1727204403.98941: Set connection var ansible_shell_type to sh 37031 1727204403.98947: Set connection var ansible_pipelining to False 37031 1727204403.98954: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204403.98960: Set connection var ansible_timeout to 10 37031 1727204403.98966: Set connection var ansible_shell_executable to /bin/sh 37031 1727204403.98987: variable 'ansible_shell_executable' from source: unknown 37031 1727204403.98990: variable 'ansible_connection' from source: unknown 37031 1727204403.98993: variable 'ansible_module_compression' from source: unknown 37031 1727204403.98995: variable 'ansible_shell_type' from source: unknown 37031 1727204403.98997: variable 'ansible_shell_executable' from source: unknown 37031 1727204403.99000: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204403.99004: variable 'ansible_pipelining' from source: unknown 37031 1727204403.99006: variable 'ansible_timeout' from source: unknown 37031 1727204403.99010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204403.99158: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204403.99167: variable 'omit' from source: magic vars 37031 1727204403.99170: 
starting attempt loop 37031 1727204403.99173: running the handler 37031 1727204403.99185: _low_level_execute_command(): starting 37031 1727204403.99192: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204403.99718: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204403.99729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204403.99763: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204403.99779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204403.99829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204403.99842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204403.99892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.01433: stdout chunk (state=3): >>>/root <<< 37031 1727204404.01532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204404.01593: stderr chunk (state=3): >>><<< 37031 1727204404.01596: stdout chunk (state=3): >>><<< 37031 1727204404.01617: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204404.01632: _low_level_execute_command(): starting 37031 1727204404.01637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818 `" && echo ansible-tmp-1727204404.0161705-38693-182293513765818="` echo /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818 `" ) && sleep 0' 37031 1727204404.02102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.02108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.02142: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.02155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.02212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204404.02229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204404.02278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.04441: stdout chunk (state=3): >>>ansible-tmp-1727204404.0161705-38693-182293513765818=/root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818 <<< 37031 1727204404.04569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204404.04620: stderr chunk (state=3): >>><<< 37031 1727204404.04624: stdout chunk (state=3): >>><<< 37031 1727204404.04641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204404.0161705-38693-182293513765818=/root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204404.04682: variable 'ansible_module_compression' from source: unknown 37031 1727204404.04726: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 37031 1727204404.04781: variable 'ansible_facts' from source: unknown 37031 1727204404.04932: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/AnsiballZ_package_facts.py 37031 1727204404.05070: Sending initial data 37031 1727204404.05074: Sent initial data (162 bytes) 37031 1727204404.05782: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204404.05801: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.05804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.05849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.05852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.05854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.05856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204404.05858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.05907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204404.05919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204404.05970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.07679: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204404.07716: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204404.07759: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpvbhwmube /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/AnsiballZ_package_facts.py <<< 37031 1727204404.08151: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204404.10420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204404.10551: stderr chunk (state=3): >>><<< 37031 1727204404.10558: stdout chunk (state=3): >>><<< 37031 1727204404.10580: done transferring module to remote 37031 1727204404.10593: _low_level_execute_command(): starting 37031 1727204404.10597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/ /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/AnsiballZ_package_facts.py && sleep 0' 37031 1727204404.11238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204404.11248: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.11262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.11276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.11316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.11323: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204404.11333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.11347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204404.11358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204404.11361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204404.11372: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.11381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.11392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.11400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.11406: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204404.11416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.11490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204404.11509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204404.11522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204404.11602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.13483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204404.13488: stdout chunk (state=3): >>><<< 37031 1727204404.13493: stderr chunk (state=3): >>><<< 37031 1727204404.13511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204404.13514: _low_level_execute_command(): starting 37031 1727204404.13520: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/AnsiballZ_package_facts.py && sleep 0' 37031 1727204404.14175: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204404.14179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.14182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.14204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.14237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.14247: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204404.14252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.14269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204404.14274: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204404.14281: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204404.14289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.14299: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.14311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.14318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.14324: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204404.14334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.14407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204404.14426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204404.14441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204404.14519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.61098: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": 
"3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": 
"popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", 
"release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", 
"release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": 
"p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "rel<<< 37031 1727204404.61144: stdout chunk (state=3): >>>ease": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", 
"version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-uti<<< 37031 1727204404.61175: stdout chunk (state=3): >>>ls", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": 
"511.el9", "epoch": null, "arch"<<< 37031 1727204404.61182: stdout chunk (state=3): >>>: "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-defa<<< 37031 1727204404.61185: stdout chunk (state=3): >>>ult-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": 
"libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": 
[{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": 
"2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", 
"version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": 
"8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": 
"3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": 
[{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", 
"release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": 
"7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 37031 1727204404.62685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204404.62780: stderr chunk (state=3): >>><<< 37031 1727204404.62784: stdout chunk (state=3): >>><<< 37031 1727204404.62818: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": 
"4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": 
"34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": 
"10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": 
"iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": 
"boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": 
[{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", 
"version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", 
"source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": 
[{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", 
"release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": 
"sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": 
"2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204404.65396: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204404.65418: _low_level_execute_command(): starting 37031 1727204404.65422: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204404.0161705-38693-182293513765818/ > /dev/null 2>&1 && sleep 0' 37031 1727204404.66152: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204404.66161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.66174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.66187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.66237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.66244: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204404.66254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.66270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204404.66277: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address 
<<< 37031 1727204404.66284: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204404.66292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204404.66302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204404.66322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204404.66330: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204404.66338: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204404.66351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204404.66430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204404.66450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204404.66468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204404.66541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204404.68358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204404.68459: stderr chunk (state=3): >>><<< 37031 1727204404.68463: stdout chunk (state=3): >>><<< 37031 1727204404.68487: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204404.68494: handler run complete 37031 1727204404.69903: variable 'ansible_facts' from source: unknown 37031 1727204404.70435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204404.72610: variable 'ansible_facts' from source: unknown 37031 1727204404.73033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204404.74491: attempt loop complete, returning result 37031 1727204404.74514: _execute() done 37031 1727204404.74522: dumping result to json 37031 1727204404.74769: done dumping result, returning 37031 1727204404.74785: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-b754-dfb8-000000000519] 37031 1727204404.74796: sending task result for task 0affcd87-79f5-b754-dfb8-000000000519 37031 1727204404.77723: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000519 37031 1727204404.77727: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204404.77933: no more pending results, returning what we have 37031 1727204404.77936: results queue empty 37031 1727204404.77938: checking for 
any_errors_fatal 37031 1727204404.77946: done checking for any_errors_fatal 37031 1727204404.77947: checking for max_fail_percentage 37031 1727204404.77949: done checking for max_fail_percentage 37031 1727204404.77950: checking to see if all hosts have failed and the running result is not ok 37031 1727204404.77951: done checking to see if all hosts have failed 37031 1727204404.77951: getting the remaining hosts for this loop 37031 1727204404.77953: done getting the remaining hosts for this loop 37031 1727204404.77960: getting the next task for host managed-node2 37031 1727204404.77971: done getting next task for host managed-node2 37031 1727204404.77976: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 37031 1727204404.77979: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204404.77992: getting variables 37031 1727204404.77994: in VariableManager get_vars() 37031 1727204404.78033: Calling all_inventory to load vars for managed-node2 37031 1727204404.78037: Calling groups_inventory to load vars for managed-node2 37031 1727204404.78040: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204404.78051: Calling all_plugins_play to load vars for managed-node2 37031 1727204404.78053: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204404.78059: Calling groups_plugins_play to load vars for managed-node2 37031 1727204404.80330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204404.83117: done with get_vars() 37031 1727204404.83147: done getting variables 37031 1727204404.83219: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:00:04 -0400 (0:00:00.863) 0:00:27.379 ***** 37031 1727204404.83470: entering _queue_task() for managed-node2/debug 37031 1727204404.83788: worker is 1 (out of 1 available) 37031 1727204404.83801: exiting _queue_task() for managed-node2/debug 37031 1727204404.83814: done queuing things up, now waiting for results queue to drain 37031 1727204404.83815: waiting for pending results... 
37031 1727204404.84126: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 37031 1727204404.84284: in run() - task 0affcd87-79f5-b754-dfb8-00000000006f 37031 1727204404.84305: variable 'ansible_search_path' from source: unknown 37031 1727204404.84313: variable 'ansible_search_path' from source: unknown 37031 1727204404.84351: calling self._execute() 37031 1727204404.84451: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204404.84468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204404.84486: variable 'omit' from source: magic vars 37031 1727204404.84975: variable 'ansible_distribution_major_version' from source: facts 37031 1727204404.84992: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204404.85003: variable 'omit' from source: magic vars 37031 1727204404.85073: variable 'omit' from source: magic vars 37031 1727204404.85249: variable 'network_provider' from source: set_fact 37031 1727204404.85279: variable 'omit' from source: magic vars 37031 1727204404.85325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204404.85374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204404.85400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204404.85422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204404.85437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204404.85481: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204404.85491: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 
1727204404.85499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204404.85593: Set connection var ansible_connection to ssh 37031 1727204404.85600: Set connection var ansible_shell_type to sh 37031 1727204404.85609: Set connection var ansible_pipelining to False 37031 1727204404.85619: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204404.85626: Set connection var ansible_timeout to 10 37031 1727204404.85634: Set connection var ansible_shell_executable to /bin/sh 37031 1727204404.85672: variable 'ansible_shell_executable' from source: unknown 37031 1727204404.85680: variable 'ansible_connection' from source: unknown 37031 1727204404.85685: variable 'ansible_module_compression' from source: unknown 37031 1727204404.85690: variable 'ansible_shell_type' from source: unknown 37031 1727204404.85695: variable 'ansible_shell_executable' from source: unknown 37031 1727204404.85700: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204404.85705: variable 'ansible_pipelining' from source: unknown 37031 1727204404.85710: variable 'ansible_timeout' from source: unknown 37031 1727204404.85715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204404.85851: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204404.85988: variable 'omit' from source: magic vars 37031 1727204404.86004: starting attempt loop 37031 1727204404.86011: running the handler 37031 1727204404.86067: handler run complete 37031 1727204404.86138: attempt loop complete, returning result 37031 1727204404.86145: _execute() done 37031 1727204404.86153: dumping result to json 37031 1727204404.86167: done dumping result, returning 
37031 1727204404.86179: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-b754-dfb8-00000000006f] 37031 1727204404.86188: sending task result for task 0affcd87-79f5-b754-dfb8-00000000006f ok: [managed-node2] => {} MSG: Using network provider: nm 37031 1727204404.86359: no more pending results, returning what we have 37031 1727204404.86365: results queue empty 37031 1727204404.86368: checking for any_errors_fatal 37031 1727204404.86378: done checking for any_errors_fatal 37031 1727204404.86379: checking for max_fail_percentage 37031 1727204404.86381: done checking for max_fail_percentage 37031 1727204404.86382: checking to see if all hosts have failed and the running result is not ok 37031 1727204404.86383: done checking to see if all hosts have failed 37031 1727204404.86384: getting the remaining hosts for this loop 37031 1727204404.86386: done getting the remaining hosts for this loop 37031 1727204404.86390: getting the next task for host managed-node2 37031 1727204404.86398: done getting next task for host managed-node2 37031 1727204404.86402: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 37031 1727204404.86407: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204404.86421: getting variables 37031 1727204404.86423: in VariableManager get_vars() 37031 1727204404.86479: Calling all_inventory to load vars for managed-node2 37031 1727204404.86482: Calling groups_inventory to load vars for managed-node2 37031 1727204404.86485: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204404.86495: Calling all_plugins_play to load vars for managed-node2 37031 1727204404.86498: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204404.86502: Calling groups_plugins_play to load vars for managed-node2 37031 1727204404.87871: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000006f 37031 1727204404.87874: WORKER PROCESS EXITING 37031 1727204404.88538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204404.90599: done with get_vars() 37031 1727204404.90630: done getting variables 37031 1727204404.90692: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:00:04 -0400 (0:00:00.072) 0:00:27.452 ***** 37031 1727204404.90726: entering _queue_task() for managed-node2/fail 37031 1727204404.91058: worker is 1 (out of 1 available) 37031 1727204404.91179: exiting _queue_task() for managed-node2/fail 37031 1727204404.91192: done queuing things up, now waiting for results queue to drain 37031 1727204404.91194: waiting for pending results... 
37031 1727204404.92216: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 37031 1727204404.92473: in run() - task 0affcd87-79f5-b754-dfb8-000000000070 37031 1727204404.92589: variable 'ansible_search_path' from source: unknown 37031 1727204404.92648: variable 'ansible_search_path' from source: unknown 37031 1727204404.92695: calling self._execute() 37031 1727204404.92943: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204404.92954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204404.93011: variable 'omit' from source: magic vars 37031 1727204404.93833: variable 'ansible_distribution_major_version' from source: facts 37031 1727204404.93966: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204404.94207: variable 'network_state' from source: role '' defaults 37031 1727204404.94223: Evaluated conditional (network_state != {}): False 37031 1727204404.94231: when evaluation is False, skipping this task 37031 1727204404.94282: _execute() done 37031 1727204404.94294: dumping result to json 37031 1727204404.94303: done dumping result, returning 37031 1727204404.94315: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-b754-dfb8-000000000070] 37031 1727204404.94332: sending task result for task 0affcd87-79f5-b754-dfb8-000000000070 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204404.94537: no more pending results, returning what we have 37031 1727204404.94542: results queue empty 37031 1727204404.94543: checking for any_errors_fatal 37031 1727204404.94554: done 
checking for any_errors_fatal 37031 1727204404.94555: checking for max_fail_percentage 37031 1727204404.94557: done checking for max_fail_percentage 37031 1727204404.94558: checking to see if all hosts have failed and the running result is not ok 37031 1727204404.94559: done checking to see if all hosts have failed 37031 1727204404.94560: getting the remaining hosts for this loop 37031 1727204404.94561: done getting the remaining hosts for this loop 37031 1727204404.94567: getting the next task for host managed-node2 37031 1727204404.94574: done getting next task for host managed-node2 37031 1727204404.94578: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 37031 1727204404.94582: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204404.94608: getting variables 37031 1727204404.94610: in VariableManager get_vars() 37031 1727204404.94656: Calling all_inventory to load vars for managed-node2 37031 1727204404.94659: Calling groups_inventory to load vars for managed-node2 37031 1727204404.94661: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204404.94675: Calling all_plugins_play to load vars for managed-node2 37031 1727204404.94678: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204404.94681: Calling groups_plugins_play to load vars for managed-node2 37031 1727204404.95377: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000070 37031 1727204404.95381: WORKER PROCESS EXITING 37031 1727204404.97297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.06019: done with get_vars() 37031 1727204405.06051: done getting variables 37031 1727204405.06107: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.154) 0:00:27.606 ***** 37031 1727204405.06137: entering _queue_task() for managed-node2/fail 37031 1727204405.06490: worker is 1 (out of 1 available) 37031 1727204405.06508: exiting _queue_task() for managed-node2/fail 37031 1727204405.06521: done queuing things up, now waiting for results queue to drain 37031 1727204405.06523: waiting for pending results... 
37031 1727204405.06831: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
37031 1727204405.06978: in run() - task 0affcd87-79f5-b754-dfb8-000000000071
37031 1727204405.06992: variable 'ansible_search_path' from source: unknown
37031 1727204405.06996: variable 'ansible_search_path' from source: unknown
37031 1727204405.07030: calling self._execute()
37031 1727204405.07131: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204405.07137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204405.07145: variable 'omit' from source: magic vars
37031 1727204405.07542: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.07554: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204405.07688: variable 'network_state' from source: role '' defaults
37031 1727204405.07704: Evaluated conditional (network_state != {}): False
37031 1727204405.07708: when evaluation is False, skipping this task
37031 1727204405.07710: _execute() done
37031 1727204405.07713: dumping result to json
37031 1727204405.07716: done dumping result, returning
37031 1727204405.07730: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-b754-dfb8-000000000071]
37031 1727204405.07733: sending task result for task 0affcd87-79f5-b754-dfb8-000000000071
37031 1727204405.07830: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000071
37031 1727204405.07833: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
37031 1727204405.07886: no more pending results, returning what we have
37031 1727204405.07891: results queue empty
37031 1727204405.07891: checking for any_errors_fatal
37031 1727204405.07900: done checking for any_errors_fatal
37031 1727204405.07901: checking for max_fail_percentage
37031 1727204405.07903: done checking for max_fail_percentage
37031 1727204405.07904: checking to see if all hosts have failed and the running result is not ok
37031 1727204405.07905: done checking to see if all hosts have failed
37031 1727204405.07906: getting the remaining hosts for this loop
37031 1727204405.07908: done getting the remaining hosts for this loop
37031 1727204405.07914: getting the next task for host managed-node2
37031 1727204405.07922: done getting next task for host managed-node2
37031 1727204405.07927: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
37031 1727204405.07930: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204405.07950: getting variables
37031 1727204405.07953: in VariableManager get_vars()
37031 1727204405.08001: Calling all_inventory to load vars for managed-node2
37031 1727204405.08004: Calling groups_inventory to load vars for managed-node2
37031 1727204405.08007: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204405.08019: Calling all_plugins_play to load vars for managed-node2
37031 1727204405.08022: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204405.08026: Calling groups_plugins_play to load vars for managed-node2
37031 1727204405.09675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204405.11892: done with get_vars()
37031 1727204405.11922: done getting variables
37031 1727204405.12099: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.059) 0:00:27.666 *****
37031 1727204405.12133: entering _queue_task() for managed-node2/fail
37031 1727204405.12819: worker is 1 (out of 1 available)
37031 1727204405.12875: exiting _queue_task() for managed-node2/fail
37031 1727204405.12888: done queuing things up, now waiting for results queue to drain
37031 1727204405.12890: waiting for pending results...
37031 1727204405.13796: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
37031 1727204405.13934: in run() - task 0affcd87-79f5-b754-dfb8-000000000072
37031 1727204405.13947: variable 'ansible_search_path' from source: unknown
37031 1727204405.13950: variable 'ansible_search_path' from source: unknown
37031 1727204405.13990: calling self._execute()
37031 1727204405.14090: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204405.14094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204405.14104: variable 'omit' from source: magic vars
37031 1727204405.14493: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.14504: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204405.14690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
37031 1727204405.17368: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
37031 1727204405.17796: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
37031 1727204405.17841: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
37031 1727204405.17876: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
37031 1727204405.17903: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
37031 1727204405.17989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.18018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.18044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.18088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.18100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.18328: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.18332: Evaluated conditional (ansible_distribution_major_version | int > 9): False
37031 1727204405.18334: when evaluation is False, skipping this task
37031 1727204405.18337: _execute() done
37031 1727204405.18339: dumping result to json
37031 1727204405.18341: done dumping result, returning
37031 1727204405.18344: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-b754-dfb8-000000000072]
37031 1727204405.18346: sending task result for task 0affcd87-79f5-b754-dfb8-000000000072
37031 1727204405.18419: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000072
37031 1727204405.18422: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int > 9",
    "skip_reason": "Conditional result was False"
}
37031 1727204405.18471: no more pending results, returning what we have
37031 1727204405.18476: results queue empty
37031 1727204405.18476: checking for any_errors_fatal
37031 1727204405.18481: done checking for any_errors_fatal
37031 1727204405.18482: checking for max_fail_percentage
37031 1727204405.18486: done checking for max_fail_percentage
37031 1727204405.18487: checking to see if all hosts have failed and the running result is not ok
37031 1727204405.18488: done checking to see if all hosts have failed
37031 1727204405.18489: getting the remaining hosts for this loop
37031 1727204405.18491: done getting the remaining hosts for this loop
37031 1727204405.18495: getting the next task for host managed-node2
37031 1727204405.18503: done getting next task for host managed-node2
37031 1727204405.18509: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
37031 1727204405.18512: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204405.18530: getting variables
37031 1727204405.18531: in VariableManager get_vars()
37031 1727204405.18594: Calling all_inventory to load vars for managed-node2
37031 1727204405.18597: Calling groups_inventory to load vars for managed-node2
37031 1727204405.18599: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204405.18607: Calling all_plugins_play to load vars for managed-node2
37031 1727204405.18609: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204405.18612: Calling groups_plugins_play to load vars for managed-node2
37031 1727204405.20672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204405.22481: done with get_vars()
37031 1727204405.22620: done getting variables
37031 1727204405.22683: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.106) 0:00:27.773 *****
37031 1727204405.22831: entering _queue_task() for managed-node2/dnf
37031 1727204405.23382: worker is 1 (out of 1 available)
37031 1727204405.23395: exiting _queue_task() for managed-node2/dnf
37031 1727204405.23408: done queuing things up, now waiting for results queue to drain
37031 1727204405.23410: waiting for pending results...
37031 1727204405.23723: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
37031 1727204405.23862: in run() - task 0affcd87-79f5-b754-dfb8-000000000073
37031 1727204405.23874: variable 'ansible_search_path' from source: unknown
37031 1727204405.23877: variable 'ansible_search_path' from source: unknown
37031 1727204405.23919: calling self._execute()
37031 1727204405.24029: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204405.24035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204405.24045: variable 'omit' from source: magic vars
37031 1727204405.24439: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.24459: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204405.24668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
37031 1727204405.28128: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
37031 1727204405.28520: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
37031 1727204405.28563: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
37031 1727204405.28642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
37031 1727204405.28674: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
37031 1727204405.28784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.28970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.28974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.29017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.29020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.29136: variable 'ansible_distribution' from source: facts
37031 1727204405.29139: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.29158: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
37031 1727204405.29283: variable '__network_wireless_connections_defined' from source: role '' defaults
37031 1727204405.29455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.29481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.29507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.29555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.29569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.29611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.29634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.29667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.29710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.29726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.29775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.29797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.29822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.29870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.29882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.30038: variable 'network_connections' from source: task vars
37031 1727204405.30050: variable 'interface' from source: play vars
37031 1727204405.30122: variable 'interface' from source: play vars
37031 1727204405.30197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
37031 1727204405.30385: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
37031 1727204405.30428: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
37031 1727204405.30459: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
37031 1727204405.30486: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
37031 1727204405.30538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
37031 1727204405.30561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
37031 1727204405.30586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.30612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
37031 1727204405.30663: variable '__network_team_connections_defined' from source: role '' defaults
37031 1727204405.31027: variable 'network_connections' from source: task vars
37031 1727204405.31031: variable 'interface' from source: play vars
37031 1727204405.31098: variable 'interface' from source: play vars
37031 1727204405.31122: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
37031 1727204405.31125: when evaluation is False, skipping this task
37031 1727204405.31128: _execute() done
37031 1727204405.31130: dumping result to json
37031 1727204405.31134: done dumping result, returning
37031 1727204405.31142: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-000000000073]
37031 1727204405.31147: sending task result for task 0affcd87-79f5-b754-dfb8-000000000073
37031 1727204405.31259: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000073
37031 1727204405.31262: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
37031 1727204405.31340: no more pending results, returning what we have
37031 1727204405.31344: results queue empty
37031 1727204405.31345: checking for any_errors_fatal
37031 1727204405.31352: done checking for any_errors_fatal
37031 1727204405.31353: checking for max_fail_percentage
37031 1727204405.31355: done checking for max_fail_percentage
37031 1727204405.31356: checking to see if all hosts have failed and the running result is not ok
37031 1727204405.31357: done checking to see if all hosts have failed
37031 1727204405.31357: getting the remaining hosts for this loop
37031 1727204405.31359: done getting the remaining hosts for this loop
37031 1727204405.31363: getting the next task for host managed-node2
37031 1727204405.31372: done getting next task for host managed-node2
37031 1727204405.31377: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
37031 1727204405.31380: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204405.31404: getting variables
37031 1727204405.31406: in VariableManager get_vars()
37031 1727204405.31476: Calling all_inventory to load vars for managed-node2
37031 1727204405.31479: Calling groups_inventory to load vars for managed-node2
37031 1727204405.31481: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204405.31491: Calling all_plugins_play to load vars for managed-node2
37031 1727204405.31494: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204405.31498: Calling groups_plugins_play to load vars for managed-node2
37031 1727204405.33468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204405.36234: done with get_vars()
37031 1727204405.36270: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
37031 1727204405.36349: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.135) 0:00:27.908 *****
37031 1727204405.36387: entering _queue_task() for managed-node2/yum
37031 1727204405.36723: worker is 1 (out of 1 available)
37031 1727204405.36736: exiting _queue_task() for managed-node2/yum
37031 1727204405.36747: done queuing things up, now waiting for results queue to drain
37031 1727204405.36748: waiting for pending results...
37031 1727204405.37058: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
37031 1727204405.37178: in run() - task 0affcd87-79f5-b754-dfb8-000000000074
37031 1727204405.37194: variable 'ansible_search_path' from source: unknown
37031 1727204405.37198: variable 'ansible_search_path' from source: unknown
37031 1727204405.37234: calling self._execute()
37031 1727204405.37332: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204405.37335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204405.37345: variable 'omit' from source: magic vars
37031 1727204405.37742: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.37753: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204405.37943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
37031 1727204405.40771: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
37031 1727204405.40845: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
37031 1727204405.40901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
37031 1727204405.40934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
37031 1727204405.40967: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
37031 1727204405.41053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
37031 1727204405.41084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
37031 1727204405.41116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
37031 1727204405.41162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
37031 1727204405.41177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
37031 1727204405.41284: variable 'ansible_distribution_major_version' from source: facts
37031 1727204405.41300: Evaluated conditional (ansible_distribution_major_version | int < 8): False
37031 1727204405.41303: when evaluation is False, skipping this task
37031 1727204405.41306: _execute() done
37031 1727204405.41309: dumping result to json
37031 1727204405.41311: done dumping result, returning
37031 1727204405.41321: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-000000000074]
37031 1727204405.41331: sending task result for task 0affcd87-79f5-b754-dfb8-000000000074
37031 1727204405.41429: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000074
37031 1727204405.41432: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
37031 1727204405.41488: no more pending results, returning what we have
37031 1727204405.41493: results queue empty
37031 1727204405.41494: checking for any_errors_fatal
37031 1727204405.41499: done checking for any_errors_fatal
37031 1727204405.41500: checking for max_fail_percentage
37031 1727204405.41502: done checking for max_fail_percentage
37031 1727204405.41503: checking to see if all hosts have failed and the running result is not ok
37031 1727204405.41504: done checking to see if all hosts have failed
37031 1727204405.41505: getting the remaining hosts for this loop
37031 1727204405.41507: done getting the remaining hosts for this loop
37031 1727204405.41512: getting the next task for host managed-node2
37031 1727204405.41519: done getting next task for host managed-node2
37031 1727204405.41524: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
37031 1727204405.41527: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204405.41546: getting variables
37031 1727204405.41548: in VariableManager get_vars()
37031 1727204405.41595: Calling all_inventory to load vars for managed-node2
37031 1727204405.41599: Calling groups_inventory to load vars for managed-node2
37031 1727204405.41601: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204405.41611: Calling all_plugins_play to load vars for managed-node2
37031 1727204405.41614: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204405.41617: Calling groups_plugins_play to load vars for managed-node2
37031 1727204405.43433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204405.45110: done with get_vars()
37031 1727204405.45143: done getting variables
37031 1727204405.45207: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.088) 0:00:27.997 *****
37031 1727204405.45247: entering _queue_task() for managed-node2/fail
37031 1727204405.45595: worker is 1 (out of 1 available)
37031 1727204405.45607: exiting _queue_task() for managed-node2/fail
37031 1727204405.45619: done queuing things up, now waiting for results queue to drain
37031 1727204405.45621: waiting for pending results...
37031 1727204405.45938: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 37031 1727204405.46070: in run() - task 0affcd87-79f5-b754-dfb8-000000000075 37031 1727204405.46089: variable 'ansible_search_path' from source: unknown 37031 1727204405.46092: variable 'ansible_search_path' from source: unknown 37031 1727204405.46135: calling self._execute() 37031 1727204405.46237: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.46241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.46251: variable 'omit' from source: magic vars 37031 1727204405.46658: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.46679: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.46807: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.47018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204405.49610: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204405.49692: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204405.49738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204405.49778: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204405.49807: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204405.49891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 37031 1727204405.49920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.49951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.49996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.50014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.50065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.50097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.50121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.50166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.50184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.50224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.50247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.50277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.50319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.50333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.50525: variable 'network_connections' from source: task vars 37031 1727204405.50537: variable 'interface' from source: play vars 37031 1727204405.50613: variable 'interface' from source: play vars 37031 1727204405.50694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204405.50872: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204405.50923: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204405.50958: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204405.50985: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204405.51031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204405.51059: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204405.51083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.51110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204405.51165: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204405.51422: variable 'network_connections' from source: task vars 37031 1727204405.51427: variable 'interface' from source: play vars 37031 1727204405.51497: variable 'interface' from source: play vars 37031 1727204405.51862: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 37031 1727204405.51868: when evaluation is False, skipping this task 37031 1727204405.51871: _execute() done 37031 1727204405.51873: dumping result to json 37031 1727204405.51875: done dumping result, returning 37031 1727204405.51882: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-000000000075] 37031 1727204405.51887: sending task result for task 0affcd87-79f5-b754-dfb8-000000000075 skipping: [managed-node2] => { "changed": false, "false_condition": 
"__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 37031 1727204405.52032: no more pending results, returning what we have 37031 1727204405.52037: results queue empty 37031 1727204405.52038: checking for any_errors_fatal 37031 1727204405.52047: done checking for any_errors_fatal 37031 1727204405.52048: checking for max_fail_percentage 37031 1727204405.52050: done checking for max_fail_percentage 37031 1727204405.52051: checking to see if all hosts have failed and the running result is not ok 37031 1727204405.52052: done checking to see if all hosts have failed 37031 1727204405.52053: getting the remaining hosts for this loop 37031 1727204405.52055: done getting the remaining hosts for this loop 37031 1727204405.52059: getting the next task for host managed-node2 37031 1727204405.52069: done getting next task for host managed-node2 37031 1727204405.52074: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 37031 1727204405.52077: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204405.52098: getting variables 37031 1727204405.52101: in VariableManager get_vars() 37031 1727204405.52144: Calling all_inventory to load vars for managed-node2 37031 1727204405.52147: Calling groups_inventory to load vars for managed-node2 37031 1727204405.52149: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204405.52162: Calling all_plugins_play to load vars for managed-node2 37031 1727204405.52167: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204405.52171: Calling groups_plugins_play to load vars for managed-node2 37031 1727204405.52794: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000075 37031 1727204405.52798: WORKER PROCESS EXITING 37031 1727204405.53786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.54939: done with get_vars() 37031 1727204405.54962: done getting variables 37031 1727204405.55030: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.098) 0:00:28.095 ***** 37031 1727204405.55066: entering _queue_task() for managed-node2/package 37031 1727204405.55393: worker is 1 (out of 1 available) 37031 1727204405.55406: exiting _queue_task() for managed-node2/package 37031 1727204405.55422: done queuing things up, now waiting for results queue to drain 37031 1727204405.55424: waiting for pending results... 
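The "Ask user's consent" skip above comes from a guarded `fail` action in the network role. A minimal sketch of that pattern, assuming a typical shape — the task name, file location, and `when:` variables are taken from the log, but the message body is invented for illustration and is not the role's actual text:

```yaml
# sketch of roles/network/tasks/main.yml:60 (location per the log's "task path")
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: Illustrative placeholder; the real role explains why a restart is needed.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Because both variables evaluated False for managed-node2, the `when:` guard fails and Ansible reports `skip_reason: "Conditional result was False"` exactly as seen above.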
37031 1727204405.55726: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 37031 1727204405.55860: in run() - task 0affcd87-79f5-b754-dfb8-000000000076 37031 1727204405.55878: variable 'ansible_search_path' from source: unknown 37031 1727204405.55881: variable 'ansible_search_path' from source: unknown 37031 1727204405.55916: calling self._execute() 37031 1727204405.56068: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.56088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.56136: variable 'omit' from source: magic vars 37031 1727204405.56536: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.56553: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.56774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204405.57048: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204405.57102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204405.57140: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204405.57223: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204405.57343: variable 'network_packages' from source: role '' defaults 37031 1727204405.57462: variable '__network_provider_setup' from source: role '' defaults 37031 1727204405.57481: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204405.57554: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204405.57573: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204405.57641: variable 
'__network_packages_default_nm' from source: role '' defaults 37031 1727204405.57838: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204405.60977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204405.61023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204405.61052: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204405.61080: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204405.61104: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204405.61198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.61241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.61277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.61321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.61345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 
1727204405.61398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.61425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.61463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.61509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.61530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.61791: variable '__network_packages_default_gobject_packages' from source: role '' defaults 37031 1727204405.61912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.61940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.61974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.62021: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.62043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.62146: variable 'ansible_python' from source: facts 37031 1727204405.62183: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 37031 1727204405.62279: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204405.62370: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204405.62552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.62580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.62622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.62676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.62688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.63012: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.63023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.63026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.63029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.63031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.63034: variable 'network_connections' from source: task vars 37031 1727204405.63036: variable 'interface' from source: play vars 37031 1727204405.63315: variable 'interface' from source: play vars 37031 1727204405.63319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204405.63321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204405.63323: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.63326: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204405.63672: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.63675: variable 'network_connections' from source: task vars 37031 1727204405.63678: variable 'interface' from source: play vars 37031 1727204405.64252: variable 'interface' from source: play vars 37031 1727204405.64258: variable '__network_packages_default_wireless' from source: role '' defaults 37031 1727204405.64261: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.64268: variable 'network_connections' from source: task vars 37031 1727204405.64271: variable 'interface' from source: play vars 37031 1727204405.64325: variable 'interface' from source: play vars 37031 1727204405.64346: variable '__network_packages_default_team' from source: role '' defaults 37031 1727204405.64745: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204405.65097: variable 'network_connections' from source: task vars 37031 1727204405.65107: variable 'interface' from source: play vars 37031 1727204405.65350: variable 'interface' from source: play vars 37031 1727204405.65422: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204405.65534: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204405.65578: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204405.65646: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204405.65992: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 37031 1727204405.66841: variable 'network_connections' from source: task vars 37031 1727204405.66845: variable 'interface' from source: play vars 
37031 1727204405.66919: variable 'interface' from source: play vars 37031 1727204405.66929: variable 'ansible_distribution' from source: facts 37031 1727204405.66932: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.66937: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.66951: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 37031 1727204405.67126: variable 'ansible_distribution' from source: facts 37031 1727204405.67129: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.67131: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.67153: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 37031 1727204405.67321: variable 'ansible_distribution' from source: facts 37031 1727204405.67324: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.67330: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.67368: variable 'network_provider' from source: set_fact 37031 1727204405.67384: variable 'ansible_facts' from source: unknown 37031 1727204405.67908: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 37031 1727204405.67912: when evaluation is False, skipping this task 37031 1727204405.67914: _execute() done 37031 1727204405.67918: dumping result to json 37031 1727204405.67920: done dumping result, returning 37031 1727204405.67928: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-b754-dfb8-000000000076] 37031 1727204405.67931: sending task result for task 0affcd87-79f5-b754-dfb8-000000000076 37031 1727204405.68025: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000076 37031 1727204405.68029: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not 
network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 37031 1727204405.68082: no more pending results, returning what we have 37031 1727204405.68085: results queue empty 37031 1727204405.68086: checking for any_errors_fatal 37031 1727204405.68094: done checking for any_errors_fatal 37031 1727204405.68094: checking for max_fail_percentage 37031 1727204405.68096: done checking for max_fail_percentage 37031 1727204405.68097: checking to see if all hosts have failed and the running result is not ok 37031 1727204405.68098: done checking to see if all hosts have failed 37031 1727204405.68099: getting the remaining hosts for this loop 37031 1727204405.68100: done getting the remaining hosts for this loop 37031 1727204405.68104: getting the next task for host managed-node2 37031 1727204405.68110: done getting next task for host managed-node2 37031 1727204405.68114: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 37031 1727204405.68117: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204405.68134: getting variables 37031 1727204405.68136: in VariableManager get_vars() 37031 1727204405.68181: Calling all_inventory to load vars for managed-node2 37031 1727204405.68184: Calling groups_inventory to load vars for managed-node2 37031 1727204405.68185: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204405.68194: Calling all_plugins_play to load vars for managed-node2 37031 1727204405.68196: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204405.68199: Calling groups_plugins_play to load vars for managed-node2 37031 1727204405.70181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.71692: done with get_vars() 37031 1727204405.71715: done getting variables 37031 1727204405.71767: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.167) 0:00:28.262 ***** 37031 1727204405.71792: entering _queue_task() for managed-node2/package 37031 1727204405.72029: worker is 1 (out of 1 available) 37031 1727204405.72044: exiting _queue_task() for managed-node2/package 37031 1727204405.72059: done queuing things up, now waiting for results queue to drain 37031 1727204405.72061: waiting for pending results... 
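The "Install packages" skip above hinges on Ansible's Jinja2 `subset` test: the task only runs when some requested package is missing from `ansible_facts.packages`. The same check in plain Python, as a sketch — the package names are illustrative, not taken from the log:

```python
# when: not network_packages is subset(ansible_facts.packages.keys())
network_packages = ["NetworkManager"]                       # illustrative role default
installed_packages = {"NetworkManager": [], "kernel": []}   # stand-in for ansible_facts.packages

# Ansible's `subset` test is a set-containment check: every requested
# package name must appear among the installed package keys.
run_task = not set(network_packages) <= set(installed_packages.keys())
print(run_task)  # False -> the conditional is False and the task is skipped
```

With every requested package already present, the conditional is False and the log records the skip rather than invoking the `package` module.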
37031 1727204405.72253: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 37031 1727204405.72348: in run() - task 0affcd87-79f5-b754-dfb8-000000000077 37031 1727204405.72358: variable 'ansible_search_path' from source: unknown 37031 1727204405.72364: variable 'ansible_search_path' from source: unknown 37031 1727204405.72397: calling self._execute() 37031 1727204405.72471: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.72476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.72483: variable 'omit' from source: magic vars 37031 1727204405.72776: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.72786: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.73062: variable 'network_state' from source: role '' defaults 37031 1727204405.73066: Evaluated conditional (network_state != {}): False 37031 1727204405.73068: when evaluation is False, skipping this task 37031 1727204405.73070: _execute() done 37031 1727204405.73072: dumping result to json 37031 1727204405.73073: done dumping result, returning 37031 1727204405.73076: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-b754-dfb8-000000000077] 37031 1727204405.73079: sending task result for task 0affcd87-79f5-b754-dfb8-000000000077 37031 1727204405.73143: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000077 37031 1727204405.73146: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204405.73190: no more pending results, returning what we have 37031 1727204405.73193: results queue empty 37031 1727204405.73194: checking 
for any_errors_fatal 37031 1727204405.73198: done checking for any_errors_fatal 37031 1727204405.73199: checking for max_fail_percentage 37031 1727204405.73201: done checking for max_fail_percentage 37031 1727204405.73202: checking to see if all hosts have failed and the running result is not ok 37031 1727204405.73203: done checking to see if all hosts have failed 37031 1727204405.73203: getting the remaining hosts for this loop 37031 1727204405.73205: done getting the remaining hosts for this loop 37031 1727204405.73208: getting the next task for host managed-node2 37031 1727204405.73213: done getting next task for host managed-node2 37031 1727204405.73217: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 37031 1727204405.73220: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204405.73235: getting variables 37031 1727204405.73237: in VariableManager get_vars() 37031 1727204405.73283: Calling all_inventory to load vars for managed-node2 37031 1727204405.73301: Calling groups_inventory to load vars for managed-node2 37031 1727204405.73304: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204405.73312: Calling all_plugins_play to load vars for managed-node2 37031 1727204405.73315: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204405.73318: Calling groups_plugins_play to load vars for managed-node2 37031 1727204405.74617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.75527: done with get_vars() 37031 1727204405.75543: done getting variables 37031 1727204405.75595: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.038) 0:00:28.301 ***** 37031 1727204405.75619: entering _queue_task() for managed-node2/package 37031 1727204405.75850: worker is 1 (out of 1 available) 37031 1727204405.75868: exiting _queue_task() for managed-node2/package 37031 1727204405.75880: done queuing things up, now waiting for results queue to drain 37031 1727204405.75881: waiting for pending results... 
37031 1727204405.76075: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 37031 1727204405.76161: in run() - task 0affcd87-79f5-b754-dfb8-000000000078 37031 1727204405.76172: variable 'ansible_search_path' from source: unknown 37031 1727204405.76175: variable 'ansible_search_path' from source: unknown 37031 1727204405.76204: calling self._execute() 37031 1727204405.76277: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.76282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.76289: variable 'omit' from source: magic vars 37031 1727204405.76564: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.76575: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.76659: variable 'network_state' from source: role '' defaults 37031 1727204405.76669: Evaluated conditional (network_state != {}): False 37031 1727204405.76672: when evaluation is False, skipping this task 37031 1727204405.76674: _execute() done 37031 1727204405.76677: dumping result to json 37031 1727204405.76679: done dumping result, returning 37031 1727204405.76685: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-b754-dfb8-000000000078] 37031 1727204405.76690: sending task result for task 0affcd87-79f5-b754-dfb8-000000000078 37031 1727204405.76789: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000078 37031 1727204405.76791: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204405.76836: no more pending results, returning what we have 37031 1727204405.76840: results queue empty 37031 1727204405.76841: checking for 
any_errors_fatal 37031 1727204405.76847: done checking for any_errors_fatal 37031 1727204405.76848: checking for max_fail_percentage 37031 1727204405.76850: done checking for max_fail_percentage 37031 1727204405.76851: checking to see if all hosts have failed and the running result is not ok 37031 1727204405.76852: done checking to see if all hosts have failed 37031 1727204405.76853: getting the remaining hosts for this loop 37031 1727204405.76854: done getting the remaining hosts for this loop 37031 1727204405.76861: getting the next task for host managed-node2 37031 1727204405.76868: done getting next task for host managed-node2 37031 1727204405.76872: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 37031 1727204405.76879: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204405.76895: getting variables 37031 1727204405.76897: in VariableManager get_vars() 37031 1727204405.76935: Calling all_inventory to load vars for managed-node2 37031 1727204405.76938: Calling groups_inventory to load vars for managed-node2 37031 1727204405.76940: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204405.76948: Calling all_plugins_play to load vars for managed-node2 37031 1727204405.76950: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204405.76953: Calling groups_plugins_play to load vars for managed-node2 37031 1727204405.77749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.78686: done with get_vars() 37031 1727204405.78705: done getting variables 37031 1727204405.78751: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.031) 0:00:28.332 ***** 37031 1727204405.78781: entering _queue_task() for managed-node2/service 37031 1727204405.79016: worker is 1 (out of 1 available) 37031 1727204405.79031: exiting _queue_task() for managed-node2/service 37031 1727204405.79043: done queuing things up, now waiting for results queue to drain 37031 1727204405.79045: waiting for pending results... 
37031 1727204405.79237: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 37031 1727204405.79329: in run() - task 0affcd87-79f5-b754-dfb8-000000000079 37031 1727204405.79341: variable 'ansible_search_path' from source: unknown 37031 1727204405.79344: variable 'ansible_search_path' from source: unknown 37031 1727204405.79377: calling self._execute() 37031 1727204405.79447: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.79452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.79461: variable 'omit' from source: magic vars 37031 1727204405.79743: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.79753: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.79838: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.79973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204405.81901: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204405.81960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204405.82185: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204405.82220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204405.82247: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204405.82335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 37031 1727204405.82367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.82394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.82436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.82455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.82499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.82523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.82548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.82590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.82605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.82711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.82714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.82717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.82760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.82772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.82944: variable 'network_connections' from source: task vars 37031 1727204405.82955: variable 'interface' from source: play vars 37031 1727204405.83030: variable 'interface' from source: play vars 37031 1727204405.83270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204405.83435: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204405.83476: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204405.83508: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204405.83616: Loading 
TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204405.83619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204405.83641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204405.83673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.83693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204405.83739: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204405.83986: variable 'network_connections' from source: task vars 37031 1727204405.83991: variable 'interface' from source: play vars 37031 1727204405.84061: variable 'interface' from source: play vars 37031 1727204405.84087: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 37031 1727204405.84090: when evaluation is False, skipping this task 37031 1727204405.84093: _execute() done 37031 1727204405.84095: dumping result to json 37031 1727204405.84100: done dumping result, returning 37031 1727204405.84109: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-b754-dfb8-000000000079] 37031 1727204405.84112: sending task result for task 0affcd87-79f5-b754-dfb8-000000000079 37031 1727204405.84216: done sending task result for task 
0affcd87-79f5-b754-dfb8-000000000079 37031 1727204405.84225: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 37031 1727204405.84278: no more pending results, returning what we have 37031 1727204405.84282: results queue empty 37031 1727204405.84283: checking for any_errors_fatal 37031 1727204405.84289: done checking for any_errors_fatal 37031 1727204405.84290: checking for max_fail_percentage 37031 1727204405.84292: done checking for max_fail_percentage 37031 1727204405.84293: checking to see if all hosts have failed and the running result is not ok 37031 1727204405.84294: done checking to see if all hosts have failed 37031 1727204405.84295: getting the remaining hosts for this loop 37031 1727204405.84297: done getting the remaining hosts for this loop 37031 1727204405.84302: getting the next task for host managed-node2 37031 1727204405.84308: done getting next task for host managed-node2 37031 1727204405.84313: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 37031 1727204405.84316: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204405.84335: getting variables 37031 1727204405.84338: in VariableManager get_vars() 37031 1727204405.84396: Calling all_inventory to load vars for managed-node2 37031 1727204405.84399: Calling groups_inventory to load vars for managed-node2 37031 1727204405.84402: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204405.84412: Calling all_plugins_play to load vars for managed-node2 37031 1727204405.84415: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204405.84418: Calling groups_plugins_play to load vars for managed-node2 37031 1727204405.85693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204405.86603: done with get_vars() 37031 1727204405.86621: done getting variables 37031 1727204405.86672: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:00:05 -0400 (0:00:00.079) 0:00:28.411 ***** 37031 1727204405.86695: entering _queue_task() for managed-node2/service 37031 1727204405.86933: worker is 1 (out of 1 available) 37031 1727204405.86946: exiting _queue_task() for managed-node2/service 37031 1727204405.86962: done queuing things up, now waiting for results queue to drain 37031 1727204405.86963: waiting for pending results... 
37031 1727204405.87146: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 37031 1727204405.87242: in run() - task 0affcd87-79f5-b754-dfb8-00000000007a 37031 1727204405.87254: variable 'ansible_search_path' from source: unknown 37031 1727204405.87260: variable 'ansible_search_path' from source: unknown 37031 1727204405.87289: calling self._execute() 37031 1727204405.87382: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.87394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.87401: variable 'omit' from source: magic vars 37031 1727204405.87699: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.87709: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204405.87825: variable 'network_provider' from source: set_fact 37031 1727204405.87828: variable 'network_state' from source: role '' defaults 37031 1727204405.87838: Evaluated conditional (network_provider == "nm" or network_state != {}): True 37031 1727204405.87842: variable 'omit' from source: magic vars 37031 1727204405.87887: variable 'omit' from source: magic vars 37031 1727204405.87906: variable 'network_service_name' from source: role '' defaults 37031 1727204405.87955: variable 'network_service_name' from source: role '' defaults 37031 1727204405.88033: variable '__network_provider_setup' from source: role '' defaults 37031 1727204405.88037: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204405.88089: variable '__network_service_name_default_nm' from source: role '' defaults 37031 1727204405.88095: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204405.88140: variable '__network_packages_default_nm' from source: role '' defaults 37031 1727204405.88297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 37031 1727204405.90331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204405.90393: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204405.90421: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204405.90447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204405.90471: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204405.90531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.90551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.90575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.90605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.90617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.90649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 37031 1727204405.90668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.90686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.90715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.90726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.90884: variable '__network_packages_default_gobject_packages' from source: role '' defaults 37031 1727204405.90968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.90984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.91001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.91029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.91039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.91106: variable 'ansible_python' from source: facts 37031 1727204405.91124: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 37031 1727204405.91188: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204405.91242: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204405.91329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.91347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.91371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.91395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.91406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.91438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204405.91458: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204405.91483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.91538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204405.91542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204405.91669: variable 'network_connections' from source: task vars 37031 1727204405.91694: variable 'interface' from source: play vars 37031 1727204405.92090: variable 'interface' from source: play vars 37031 1727204405.92093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204405.92096: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204405.92098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204405.92169: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204405.92172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204405.92231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204405.92258: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204405.92294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204405.92327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204405.92377: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.92657: variable 'network_connections' from source: task vars 37031 1727204405.92661: variable 'interface' from source: play vars 37031 1727204405.92733: variable 'interface' from source: play vars 37031 1727204405.92768: variable '__network_packages_default_wireless' from source: role '' defaults 37031 1727204405.92850: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204405.93153: variable 'network_connections' from source: task vars 37031 1727204405.93169: variable 'interface' from source: play vars 37031 1727204405.93242: variable 'interface' from source: play vars 37031 1727204405.93275: variable '__network_packages_default_team' from source: role '' defaults 37031 1727204405.93354: variable '__network_team_connections_defined' from source: role '' defaults 37031 1727204405.93655: variable 'network_connections' from source: task vars 37031 1727204405.93671: variable 'interface' from source: play vars 37031 1727204405.93742: variable 'interface' from source: play vars 37031 1727204405.93804: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204405.93870: variable '__network_service_name_default_initscripts' from source: role '' defaults 37031 1727204405.93882: 
variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204405.93942: variable '__network_packages_default_initscripts' from source: role '' defaults 37031 1727204405.94168: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 37031 1727204405.94682: variable 'network_connections' from source: task vars 37031 1727204405.94692: variable 'interface' from source: play vars 37031 1727204405.94754: variable 'interface' from source: play vars 37031 1727204405.94773: variable 'ansible_distribution' from source: facts 37031 1727204405.94782: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.94792: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.94812: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 37031 1727204405.94989: variable 'ansible_distribution' from source: facts 37031 1727204405.94999: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.95009: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.95027: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 37031 1727204405.95202: variable 'ansible_distribution' from source: facts 37031 1727204405.95216: variable '__network_rh_distros' from source: role '' defaults 37031 1727204405.95219: variable 'ansible_distribution_major_version' from source: facts 37031 1727204405.95305: variable 'network_provider' from source: set_fact 37031 1727204405.95342: variable 'omit' from source: magic vars 37031 1727204405.95384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204405.95407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204405.95422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 
1727204405.95434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204405.95442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204405.95469: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204405.95472: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.95476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.95542: Set connection var ansible_connection to ssh 37031 1727204405.95545: Set connection var ansible_shell_type to sh 37031 1727204405.95551: Set connection var ansible_pipelining to False 37031 1727204405.95560: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204405.95563: Set connection var ansible_timeout to 10 37031 1727204405.95570: Set connection var ansible_shell_executable to /bin/sh 37031 1727204405.95593: variable 'ansible_shell_executable' from source: unknown 37031 1727204405.95596: variable 'ansible_connection' from source: unknown 37031 1727204405.95599: variable 'ansible_module_compression' from source: unknown 37031 1727204405.95601: variable 'ansible_shell_type' from source: unknown 37031 1727204405.95604: variable 'ansible_shell_executable' from source: unknown 37031 1727204405.95606: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204405.95609: variable 'ansible_pipelining' from source: unknown 37031 1727204405.95611: variable 'ansible_timeout' from source: unknown 37031 1727204405.95619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204405.95689: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204405.95698: variable 'omit' from source: magic vars 37031 1727204405.95705: starting attempt loop 37031 1727204405.95708: running the handler 37031 1727204405.95766: variable 'ansible_facts' from source: unknown 37031 1727204405.96245: _low_level_execute_command(): starting 37031 1727204405.96250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204405.96862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204405.96871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204405.97086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204405.97169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204405.97173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204405.97176: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204405.97178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204405.97180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204405.97182: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204405.97185: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204405.97187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204405.97210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204405.97213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 37031 1727204405.97216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204405.97218: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204405.97228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204405.97300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204405.97323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204405.97331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204405.97404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204405.99044: stdout chunk (state=3): >>>/root <<< 37031 1727204405.99185: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204405.99208: stderr chunk (state=3): >>><<< 37031 1727204405.99210: stdout chunk (state=3): >>><<< 37031 1727204405.99270: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204405.99274: _low_level_execute_command(): starting 37031 1727204405.99277: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470 `" && echo ansible-tmp-1727204405.99223-38922-174579196499470="` echo /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470 `" ) && sleep 0' 37031 1727204405.99692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204405.99697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204405.99706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204405.99722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204405.99753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204405.99765: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204405.99771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204405.99780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204405.99788: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204405.99793: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204405.99801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204405.99809: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204405.99814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204405.99820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204405.99877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204405.99885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204405.99892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204405.99971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.01830: stdout chunk (state=3): >>>ansible-tmp-1727204405.99223-38922-174579196499470=/root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470 <<< 37031 1727204406.01939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.02032: stderr chunk (state=3): >>><<< 37031 1727204406.02046: stdout chunk (state=3): >>><<< 37031 1727204406.02170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204405.99223-38922-174579196499470=/root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.02181: variable 'ansible_module_compression' from source: unknown 37031 1727204406.02184: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 37031 1727204406.02373: variable 'ansible_facts' from source: unknown 37031 1727204406.02462: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/AnsiballZ_systemd.py 37031 1727204406.02630: Sending initial data 37031 1727204406.02634: Sent initial data (154 bytes) 37031 1727204406.03622: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.03631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.03644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.04103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.04410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.04418: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.04427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.04751: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.04763: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.04771: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.04782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.04794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.04806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.04814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.04820: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.04830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.04908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.04935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.04962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.05381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.06772: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204406.06809: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server 
upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204406.06845: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp4ozxtiu9 /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/AnsiballZ_systemd.py <<< 37031 1727204406.06879: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204406.10069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.10155: stderr chunk (state=3): >>><<< 37031 1727204406.10158: stdout chunk (state=3): >>><<< 37031 1727204406.10186: done transferring module to remote 37031 1727204406.10199: _low_level_execute_command(): starting 37031 1727204406.10202: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/ /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/AnsiballZ_systemd.py && sleep 0' 37031 1727204406.10987: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.11005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.11019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.11037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.11084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.11098: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.11121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.11144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.11158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 
37031 1727204406.11172: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.11194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.11219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.11237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.11255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.11269: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.11295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.11384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.11409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.11424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.11501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.13292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.13371: stderr chunk (state=3): >>><<< 37031 1727204406.13375: stdout chunk (state=3): >>><<< 37031 1727204406.13469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.13477: _low_level_execute_command(): starting 37031 1727204406.13480: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/AnsiballZ_systemd.py && sleep 0' 37031 1727204406.14073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.14088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.14102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.14118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.14162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.14178: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.14192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.14208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.14219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.14228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.14239: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.14253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.14271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.14282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.14292: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.14304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.14387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.14406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.14420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.14509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.39770: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", 
"ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6983680", "MemoryAvailable": "infinity", "CPUUsageNSec": "1811057000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "Watchdog<<< 37031 1727204406.39796: stdout chunk (state=3): >>>Signal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", 
"Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, 
"masked": null}}} <<< 37031 1727204406.41400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204406.41404: stdout chunk (state=3): >>><<< 37031 1727204406.41411: stderr chunk (state=3): >>><<< 37031 1727204406.41599: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6823", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ExecMainStartTimestampMonotonic": "319366198", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "6823", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3602", "MemoryCurrent": "6983680", "MemoryAvailable": "infinity", "CPUUsageNSec": "1811057000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": 
"0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target network.service multi-user.target shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service cloud-init-local.service system.slice basic.target dbus.socket sysinit.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:30 EDT", "StateChangeTimestampMonotonic": "433536261", "InactiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveExitTimestampMonotonic": "319366492", "ActiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveEnterTimestampMonotonic": "319444795", "ActiveExitTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ActiveExitTimestampMonotonic": "319337881", "InactiveEnterTimestamp": "Tue 2024-09-24 14:52:36 EDT", "InactiveEnterTimestampMonotonic": "319361759", 
"CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:52:36 EDT", "ConditionTimestampMonotonic": "319362324", "AssertTimestamp": "Tue 2024-09-24 14:52:36 EDT", "AssertTimestampMonotonic": "319362327", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "bc82db972fb14d0fb9ce19d409aedafe", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204406.42139: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204406.42165: _low_level_execute_command(): starting 37031 1727204406.42169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204405.99223-38922-174579196499470/ > /dev/null 2>&1 && sleep 0' 37031 1727204406.42927: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.42936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.42946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.42968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.43013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 
10.31.13.78 <<< 37031 1727204406.43020: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.43029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.43043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.43050: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.43056: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.43068: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.43078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.43092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.43101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.43108: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.43116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.43191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.43212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.43224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.43292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.45203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.45208: stderr chunk (state=3): >>><<< 37031 1727204406.45215: stdout chunk (state=3): >>><<< 37031 1727204406.45474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.45477: handler run complete 37031 1727204406.45480: attempt loop complete, returning result 37031 1727204406.45483: _execute() done 37031 1727204406.45485: dumping result to json 37031 1727204406.45487: done dumping result, returning 37031 1727204406.45489: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-b754-dfb8-00000000007a] 37031 1727204406.45491: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007a ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204406.46333: no more pending results, returning what we have 37031 1727204406.46337: results queue empty 37031 1727204406.46338: checking for any_errors_fatal 37031 1727204406.46343: done checking for any_errors_fatal 37031 1727204406.46344: checking for 
max_fail_percentage 37031 1727204406.46345: done checking for max_fail_percentage 37031 1727204406.46346: checking to see if all hosts have failed and the running result is not ok 37031 1727204406.46347: done checking to see if all hosts have failed 37031 1727204406.46348: getting the remaining hosts for this loop 37031 1727204406.46350: done getting the remaining hosts for this loop 37031 1727204406.46354: getting the next task for host managed-node2 37031 1727204406.46361: done getting next task for host managed-node2 37031 1727204406.46368: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 37031 1727204406.46372: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204406.46386: getting variables 37031 1727204406.46389: in VariableManager get_vars() 37031 1727204406.46427: Calling all_inventory to load vars for managed-node2 37031 1727204406.46430: Calling groups_inventory to load vars for managed-node2 37031 1727204406.46432: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204406.46443: Calling all_plugins_play to load vars for managed-node2 37031 1727204406.46445: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204406.46448: Calling groups_plugins_play to load vars for managed-node2 37031 1727204406.47426: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007a 37031 1727204406.47430: WORKER PROCESS EXITING 37031 1727204406.47845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204406.48938: done with get_vars() 37031 1727204406.48965: done getting variables 37031 1727204406.49028: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:00:06 -0400 (0:00:00.623) 0:00:29.035 ***** 37031 1727204406.49072: entering _queue_task() for managed-node2/service 37031 1727204406.49416: worker is 1 (out of 1 available) 37031 1727204406.49428: exiting _queue_task() for managed-node2/service 37031 1727204406.49441: done queuing things up, now waiting for results queue to drain 37031 1727204406.49443: waiting for pending results... 
37031 1727204406.49753: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 37031 1727204406.49882: in run() - task 0affcd87-79f5-b754-dfb8-00000000007b 37031 1727204406.49901: variable 'ansible_search_path' from source: unknown 37031 1727204406.49905: variable 'ansible_search_path' from source: unknown 37031 1727204406.49943: calling self._execute() 37031 1727204406.50040: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.50043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.50053: variable 'omit' from source: magic vars 37031 1727204406.50420: variable 'ansible_distribution_major_version' from source: facts 37031 1727204406.50431: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204406.50518: variable 'network_provider' from source: set_fact 37031 1727204406.50521: Evaluated conditional (network_provider == "nm"): True 37031 1727204406.50591: variable '__network_wpa_supplicant_required' from source: role '' defaults 37031 1727204406.50653: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 37031 1727204406.50783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204406.52354: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204406.52405: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204406.52434: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204406.52461: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204406.52487: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204406.52557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204406.52583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204406.52603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204406.52630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204406.52642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204406.52679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204406.52699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204406.52715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204406.52741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204406.52752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204406.52784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204406.52803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204406.52820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204406.52846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204406.52856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204406.52959: variable 'network_connections' from source: task vars 37031 1727204406.52974: variable 'interface' from source: play vars 37031 1727204406.53026: variable 'interface' from source: play vars 37031 1727204406.53081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204406.53195: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204406.53222: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204406.53248: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204406.53274: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204406.53306: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 37031 1727204406.53321: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 37031 1727204406.53343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204406.53362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 37031 1727204406.53401: variable '__network_wireless_connections_defined' from source: role '' defaults 37031 1727204406.53566: variable 'network_connections' from source: task vars 37031 1727204406.53570: variable 'interface' from source: play vars 37031 1727204406.53616: variable 'interface' from source: play vars 37031 1727204406.53639: Evaluated conditional (__network_wpa_supplicant_required): False 37031 1727204406.53642: when evaluation is False, skipping this task 37031 1727204406.53644: _execute() done 37031 1727204406.53647: dumping result to json 37031 1727204406.53651: done dumping result, returning 37031 1727204406.53658: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-b754-dfb8-00000000007b] 37031 
1727204406.53671: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007b 37031 1727204406.53755: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007b 37031 1727204406.53758: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 37031 1727204406.53834: no more pending results, returning what we have 37031 1727204406.53839: results queue empty 37031 1727204406.53840: checking for any_errors_fatal 37031 1727204406.53867: done checking for any_errors_fatal 37031 1727204406.53868: checking for max_fail_percentage 37031 1727204406.53870: done checking for max_fail_percentage 37031 1727204406.53871: checking to see if all hosts have failed and the running result is not ok 37031 1727204406.53872: done checking to see if all hosts have failed 37031 1727204406.53872: getting the remaining hosts for this loop 37031 1727204406.53874: done getting the remaining hosts for this loop 37031 1727204406.53878: getting the next task for host managed-node2 37031 1727204406.53885: done getting next task for host managed-node2 37031 1727204406.53891: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 37031 1727204406.53894: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204406.53910: getting variables 37031 1727204406.53911: in VariableManager get_vars() 37031 1727204406.53948: Calling all_inventory to load vars for managed-node2 37031 1727204406.53950: Calling groups_inventory to load vars for managed-node2 37031 1727204406.53952: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204406.53960: Calling all_plugins_play to load vars for managed-node2 37031 1727204406.53963: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204406.53968: Calling groups_plugins_play to load vars for managed-node2 37031 1727204406.54772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204406.55689: done with get_vars() 37031 1727204406.55708: done getting variables 37031 1727204406.55757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:00:06 -0400 (0:00:00.067) 0:00:29.102 ***** 37031 1727204406.55783: entering _queue_task() for managed-node2/service 37031 1727204406.56020: worker is 1 (out of 1 available) 37031 1727204406.56033: exiting _queue_task() for managed-node2/service 37031 1727204406.56048: done queuing things up, now waiting for results queue to drain 37031 1727204406.56050: waiting for pending results... 
37031 1727204406.56242: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 37031 1727204406.56335: in run() - task 0affcd87-79f5-b754-dfb8-00000000007c 37031 1727204406.56345: variable 'ansible_search_path' from source: unknown 37031 1727204406.56349: variable 'ansible_search_path' from source: unknown 37031 1727204406.56383: calling self._execute() 37031 1727204406.56456: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.56462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.56475: variable 'omit' from source: magic vars 37031 1727204406.56760: variable 'ansible_distribution_major_version' from source: facts 37031 1727204406.56770: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204406.56854: variable 'network_provider' from source: set_fact 37031 1727204406.56860: Evaluated conditional (network_provider == "initscripts"): False 37031 1727204406.56863: when evaluation is False, skipping this task 37031 1727204406.56868: _execute() done 37031 1727204406.56871: dumping result to json 37031 1727204406.56873: done dumping result, returning 37031 1727204406.56879: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-b754-dfb8-00000000007c] 37031 1727204406.56884: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007c 37031 1727204406.56972: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007c 37031 1727204406.56975: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 37031 1727204406.57020: no more pending results, returning what we have 37031 1727204406.57023: results queue empty 37031 1727204406.57024: checking for any_errors_fatal 37031 1727204406.57032: done checking for 
any_errors_fatal 37031 1727204406.57032: checking for max_fail_percentage 37031 1727204406.57034: done checking for max_fail_percentage 37031 1727204406.57035: checking to see if all hosts have failed and the running result is not ok 37031 1727204406.57036: done checking to see if all hosts have failed 37031 1727204406.57037: getting the remaining hosts for this loop 37031 1727204406.57038: done getting the remaining hosts for this loop 37031 1727204406.57042: getting the next task for host managed-node2 37031 1727204406.57049: done getting next task for host managed-node2 37031 1727204406.57053: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 37031 1727204406.57058: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204406.57079: getting variables 37031 1727204406.57081: in VariableManager get_vars() 37031 1727204406.57122: Calling all_inventory to load vars for managed-node2 37031 1727204406.57125: Calling groups_inventory to load vars for managed-node2 37031 1727204406.57127: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204406.57136: Calling all_plugins_play to load vars for managed-node2 37031 1727204406.57138: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204406.57140: Calling groups_plugins_play to load vars for managed-node2 37031 1727204406.58072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204406.58975: done with get_vars() 37031 1727204406.58992: done getting variables 37031 1727204406.59037: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:00:06 -0400 (0:00:00.032) 0:00:29.135 ***** 37031 1727204406.59066: entering _queue_task() for managed-node2/copy 37031 1727204406.59297: worker is 1 (out of 1 available) 37031 1727204406.59312: exiting _queue_task() for managed-node2/copy 37031 1727204406.59325: done queuing things up, now waiting for results queue to drain 37031 1727204406.59327: waiting for pending results... 
37031 1727204406.59512: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 37031 1727204406.59613: in run() - task 0affcd87-79f5-b754-dfb8-00000000007d 37031 1727204406.59624: variable 'ansible_search_path' from source: unknown 37031 1727204406.59628: variable 'ansible_search_path' from source: unknown 37031 1727204406.59660: calling self._execute() 37031 1727204406.59731: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.59735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.59744: variable 'omit' from source: magic vars 37031 1727204406.60025: variable 'ansible_distribution_major_version' from source: facts 37031 1727204406.60034: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204406.60114: variable 'network_provider' from source: set_fact 37031 1727204406.60120: Evaluated conditional (network_provider == "initscripts"): False 37031 1727204406.60123: when evaluation is False, skipping this task 37031 1727204406.60125: _execute() done 37031 1727204406.60129: dumping result to json 37031 1727204406.60131: done dumping result, returning 37031 1727204406.60139: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-b754-dfb8-00000000007d] 37031 1727204406.60143: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007d 37031 1727204406.60232: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007d 37031 1727204406.60235: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 37031 1727204406.60287: no more pending results, returning what we have 37031 1727204406.60292: results queue empty 37031 1727204406.60293: checking for 
any_errors_fatal 37031 1727204406.60299: done checking for any_errors_fatal 37031 1727204406.60300: checking for max_fail_percentage 37031 1727204406.60302: done checking for max_fail_percentage 37031 1727204406.60303: checking to see if all hosts have failed and the running result is not ok 37031 1727204406.60304: done checking to see if all hosts have failed 37031 1727204406.60304: getting the remaining hosts for this loop 37031 1727204406.60306: done getting the remaining hosts for this loop 37031 1727204406.60310: getting the next task for host managed-node2 37031 1727204406.60317: done getting next task for host managed-node2 37031 1727204406.60321: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 37031 1727204406.60324: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204406.60341: getting variables 37031 1727204406.60343: in VariableManager get_vars() 37031 1727204406.60388: Calling all_inventory to load vars for managed-node2 37031 1727204406.60391: Calling groups_inventory to load vars for managed-node2 37031 1727204406.60393: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204406.60401: Calling all_plugins_play to load vars for managed-node2 37031 1727204406.60403: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204406.60406: Calling groups_plugins_play to load vars for managed-node2 37031 1727204406.61188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204406.62103: done with get_vars() 37031 1727204406.62119: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:00:06 -0400 (0:00:00.031) 0:00:29.166 ***** 37031 1727204406.62187: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 37031 1727204406.62410: worker is 1 (out of 1 available) 37031 1727204406.62423: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 37031 1727204406.62437: done queuing things up, now waiting for results queue to drain 37031 1727204406.62438: waiting for pending results... 
37031 1727204406.62629: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 37031 1727204406.62726: in run() - task 0affcd87-79f5-b754-dfb8-00000000007e 37031 1727204406.62737: variable 'ansible_search_path' from source: unknown 37031 1727204406.62745: variable 'ansible_search_path' from source: unknown 37031 1727204406.62779: calling self._execute() 37031 1727204406.62854: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.62860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.62868: variable 'omit' from source: magic vars 37031 1727204406.63133: variable 'ansible_distribution_major_version' from source: facts 37031 1727204406.63143: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204406.63149: variable 'omit' from source: magic vars 37031 1727204406.63196: variable 'omit' from source: magic vars 37031 1727204406.63312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204406.65105: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204406.65143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204406.65181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204406.65206: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204406.65227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204406.65290: variable 'network_provider' from source: set_fact 37031 1727204406.65385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204406.65405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204406.65422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204406.65448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204406.65462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204406.65515: variable 'omit' from source: magic vars 37031 1727204406.65601: variable 'omit' from source: magic vars 37031 1727204406.65675: variable 'network_connections' from source: task vars 37031 1727204406.65686: variable 'interface' from source: play vars 37031 1727204406.65731: variable 'interface' from source: play vars 37031 1727204406.65834: variable 'omit' from source: magic vars 37031 1727204406.65841: variable '__lsr_ansible_managed' from source: task vars 37031 1727204406.65886: variable '__lsr_ansible_managed' from source: task vars 37031 1727204406.66082: Loaded config def from plugin (lookup/template) 37031 1727204406.66085: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 37031 1727204406.66108: File lookup term: get_ansible_managed.j2 37031 1727204406.66112: variable 'ansible_search_path' from source: unknown 37031 1727204406.66115: evaluation_path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 37031 1727204406.66128: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 37031 1727204406.66141: variable 'ansible_search_path' from source: unknown 37031 1727204406.71105: variable 'ansible_managed' from source: unknown 37031 1727204406.71268: variable 'omit' from source: magic vars 37031 1727204406.71297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204406.71324: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204406.71343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204406.71365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204406.71376: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204406.71404: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204406.71408: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.71410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.71506: Set connection var ansible_connection to ssh 37031 1727204406.71509: Set connection var ansible_shell_type to sh 37031 1727204406.71516: Set connection var ansible_pipelining to False 37031 1727204406.71524: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204406.71530: Set connection var ansible_timeout to 10 37031 1727204406.71535: Set connection var ansible_shell_executable to /bin/sh 37031 1727204406.71568: variable 'ansible_shell_executable' from source: unknown 37031 1727204406.71571: variable 'ansible_connection' from source: unknown 37031 1727204406.71573: variable 'ansible_module_compression' from source: unknown 37031 1727204406.71575: variable 'ansible_shell_type' from source: unknown 37031 1727204406.71578: variable 'ansible_shell_executable' from source: unknown 37031 1727204406.71580: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204406.71584: variable 'ansible_pipelining' from source: unknown 37031 1727204406.71586: variable 'ansible_timeout' from source: unknown 37031 1727204406.71591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204406.71728: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204406.71740: variable 'omit' from source: magic vars 37031 1727204406.71743: starting attempt loop 37031 
1727204406.71745: running the handler 37031 1727204406.71758: _low_level_execute_command(): starting 37031 1727204406.71771: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204406.72508: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.72520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.72532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.72548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.72591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.72598: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.72608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.72622: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.72630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.72636: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.72644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.72653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.72671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.72680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.72683: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.72692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 
1727204406.72768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.72787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.72802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.72880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.74527: stdout chunk (state=3): >>>/root <<< 37031 1727204406.74704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.74708: stdout chunk (state=3): >>><<< 37031 1727204406.74718: stderr chunk (state=3): >>><<< 37031 1727204406.74739: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.74752: _low_level_execute_command(): starting 37031 1727204406.74756: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126 `" && echo ansible-tmp-1727204406.7473967-38959-216226615939126="` echo /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126 `" ) && sleep 0' 37031 1727204406.75420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.75429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.75440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.75454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.75500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.75507: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.75527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.75530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.75538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.75544: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.75555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.75566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.75580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.75586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.75593: stderr chunk (state=3): >>>debug2: match 
found <<< 37031 1727204406.75603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.75676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.75691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.75694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.75879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.77671: stdout chunk (state=3): >>>ansible-tmp-1727204406.7473967-38959-216226615939126=/root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126 <<< 37031 1727204406.77897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.77900: stdout chunk (state=3): >>><<< 37031 1727204406.77903: stderr chunk (state=3): >>><<< 37031 1727204406.78280: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204406.7473967-38959-216226615939126=/root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.78288: variable 'ansible_module_compression' from source: unknown 37031 1727204406.78291: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 37031 1727204406.78293: variable 'ansible_facts' from source: unknown 37031 1727204406.78295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/AnsiballZ_network_connections.py 37031 1727204406.78380: Sending initial data 37031 1727204406.78383: Sent initial data (168 bytes) 37031 1727204406.79380: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.79395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.79410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.79428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.79484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.79498: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.79511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.79528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.79538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.79548: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
37031 1727204406.79558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.79575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.79591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.79607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.79618: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.79631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.79712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.79735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.79752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.79832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.81595: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204406.81660: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204406.81710: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpjwl24nc7 /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/AnsiballZ_network_connections.py <<< 37031 1727204406.81745: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204406.83630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.83816: stderr chunk (state=3): >>><<< 37031 1727204406.83819: stdout chunk (state=3): >>><<< 37031 1727204406.83822: done transferring module to remote 37031 1727204406.83824: _low_level_execute_command(): starting 37031 1727204406.83826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/ /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/AnsiballZ_network_connections.py && sleep 0' 37031 1727204406.85413: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.85446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.85472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.85516: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.85581: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.85596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.85614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.85627: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.85640: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 
1727204406.85652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.85671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.85688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.85700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.85712: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.85726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.85923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.85941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.85955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.86082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204406.87884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204406.87951: stderr chunk (state=3): >>><<< 37031 1727204406.87954: stdout chunk (state=3): >>><<< 37031 1727204406.88053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204406.88061: _low_level_execute_command(): starting 37031 1727204406.88065: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/AnsiballZ_network_connections.py && sleep 0' 37031 1727204406.89526: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204406.89596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204406.89612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.89708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.89749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.89808: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204406.89824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.89844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204406.89857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204406.89871: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204406.89883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 
37031 1727204406.89898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204406.89917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204406.90025: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204406.90038: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204406.90053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204406.90195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204406.90252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204406.90272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204406.90369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.20115: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jmh3trmd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jmh3trmd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/3d37e8b2-4205-4a19-9842-5a81810c6006: error=unknown <<< 37031 1727204407.20333: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": 
[{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 37031 1727204407.21949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204407.21953: stdout chunk (state=3): >>><<< 37031 1727204407.21956: stderr chunk (state=3): >>><<< 37031 1727204407.22099: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jmh3trmd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jmh3trmd/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/3d37e8b2-4205-4a19-9842-5a81810c6006: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": 
"veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
37031 1727204407.22104: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204407.22107: _low_level_execute_command(): starting 37031 1727204407.22114: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204406.7473967-38959-216226615939126/ > /dev/null 2>&1 && sleep 0' 37031 1727204407.22672: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.22696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.22712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.22731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.22775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.22790: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.22804: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.22823: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.22836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.22847: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.22859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.22877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.22894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.22905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.22916: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.22929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.23006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.23023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.23037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.23118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.25058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.25062: stdout chunk (state=3): >>><<< 37031 1727204407.25090: stderr chunk (state=3): >>><<< 37031 1727204407.25133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204407.25139: handler run complete 37031 1727204407.25216: attempt loop complete, returning result 37031 1727204407.25220: _execute() done 37031 1727204407.25222: dumping result to json 37031 1727204407.25238: done dumping result, returning 37031 1727204407.25241: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-b754-dfb8-00000000007e] 37031 1727204407.25243: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007e 37031 1727204407.25368: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007e 37031 1727204407.25371: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 37031 1727204407.25484: no more pending results, returning what we have 37031 1727204407.25489: results queue 
empty 37031 1727204407.25490: checking for any_errors_fatal 37031 1727204407.25496: done checking for any_errors_fatal 37031 1727204407.25497: checking for max_fail_percentage 37031 1727204407.25499: done checking for max_fail_percentage 37031 1727204407.25500: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.25501: done checking to see if all hosts have failed 37031 1727204407.25502: getting the remaining hosts for this loop 37031 1727204407.25504: done getting the remaining hosts for this loop 37031 1727204407.25508: getting the next task for host managed-node2 37031 1727204407.25515: done getting next task for host managed-node2 37031 1727204407.25519: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 37031 1727204407.25522: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204407.25533: getting variables 37031 1727204407.25535: in VariableManager get_vars() 37031 1727204407.25580: Calling all_inventory to load vars for managed-node2 37031 1727204407.25582: Calling groups_inventory to load vars for managed-node2 37031 1727204407.25585: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.25594: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.25596: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.25599: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.28910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204407.30667: done with get_vars() 37031 1727204407.30691: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:00:07 -0400 (0:00:00.685) 0:00:29.852 ***** 37031 1727204407.30758: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 37031 1727204407.30999: worker is 1 (out of 1 available) 37031 1727204407.31012: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 37031 1727204407.31029: done queuing things up, now waiting for results queue to drain 37031 1727204407.31033: waiting for pending results... 
37031 1727204407.31377: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 37031 1727204407.31490: in run() - task 0affcd87-79f5-b754-dfb8-00000000007f 37031 1727204407.31504: variable 'ansible_search_path' from source: unknown 37031 1727204407.31508: variable 'ansible_search_path' from source: unknown 37031 1727204407.31555: calling self._execute() 37031 1727204407.31655: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.31665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.31677: variable 'omit' from source: magic vars 37031 1727204407.32437: variable 'ansible_distribution_major_version' from source: facts 37031 1727204407.32474: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204407.32622: variable 'network_state' from source: role '' defaults 37031 1727204407.32638: Evaluated conditional (network_state != {}): False 37031 1727204407.32644: when evaluation is False, skipping this task 37031 1727204407.32651: _execute() done 37031 1727204407.32658: dumping result to json 37031 1727204407.32669: done dumping result, returning 37031 1727204407.32679: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-b754-dfb8-00000000007f] 37031 1727204407.32688: sending task result for task 0affcd87-79f5-b754-dfb8-00000000007f 37031 1727204407.32804: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000007f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 37031 1727204407.32863: no more pending results, returning what we have 37031 1727204407.32869: results queue empty 37031 1727204407.32870: checking for any_errors_fatal 37031 1727204407.32881: done checking for any_errors_fatal 37031 1727204407.32882: checking for 
max_fail_percentage 37031 1727204407.32884: done checking for max_fail_percentage 37031 1727204407.32885: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.32887: done checking to see if all hosts have failed 37031 1727204407.32887: getting the remaining hosts for this loop 37031 1727204407.32889: done getting the remaining hosts for this loop 37031 1727204407.32893: getting the next task for host managed-node2 37031 1727204407.32901: done getting next task for host managed-node2 37031 1727204407.32906: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 37031 1727204407.32910: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204407.32931: getting variables 37031 1727204407.32933: in VariableManager get_vars() 37031 1727204407.32984: Calling all_inventory to load vars for managed-node2 37031 1727204407.32987: Calling groups_inventory to load vars for managed-node2 37031 1727204407.32989: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.33002: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.33005: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.33008: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.33940: WORKER PROCESS EXITING 37031 1727204407.34161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204407.36241: done with get_vars() 37031 1727204407.36275: done getting variables 37031 1727204407.36323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:00:07 -0400 (0:00:00.055) 0:00:29.908 ***** 37031 1727204407.36349: entering _queue_task() for managed-node2/debug 37031 1727204407.36595: worker is 1 (out of 1 available) 37031 1727204407.36608: exiting _queue_task() for managed-node2/debug 37031 1727204407.36621: done queuing things up, now waiting for results queue to drain 37031 1727204407.36622: waiting for pending results... 
37031 1727204407.36823: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 37031 1727204407.36917: in run() - task 0affcd87-79f5-b754-dfb8-000000000080 37031 1727204407.36929: variable 'ansible_search_path' from source: unknown 37031 1727204407.36933: variable 'ansible_search_path' from source: unknown 37031 1727204407.36970: calling self._execute() 37031 1727204407.37039: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.37044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.37051: variable 'omit' from source: magic vars 37031 1727204407.37338: variable 'ansible_distribution_major_version' from source: facts 37031 1727204407.37348: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204407.37354: variable 'omit' from source: magic vars 37031 1727204407.37400: variable 'omit' from source: magic vars 37031 1727204407.37425: variable 'omit' from source: magic vars 37031 1727204407.37458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204407.37490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204407.37507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204407.37521: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.37530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.37555: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204407.37558: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.37565: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 37031 1727204407.37636: Set connection var ansible_connection to ssh 37031 1727204407.37639: Set connection var ansible_shell_type to sh 37031 1727204407.37645: Set connection var ansible_pipelining to False 37031 1727204407.37652: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204407.37657: Set connection var ansible_timeout to 10 37031 1727204407.37665: Set connection var ansible_shell_executable to /bin/sh 37031 1727204407.37693: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.37696: variable 'ansible_connection' from source: unknown 37031 1727204407.37700: variable 'ansible_module_compression' from source: unknown 37031 1727204407.37703: variable 'ansible_shell_type' from source: unknown 37031 1727204407.37705: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.37707: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.37709: variable 'ansible_pipelining' from source: unknown 37031 1727204407.37712: variable 'ansible_timeout' from source: unknown 37031 1727204407.37715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.37896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204407.37904: variable 'omit' from source: magic vars 37031 1727204407.37940: starting attempt loop 37031 1727204407.37943: running the handler 37031 1727204407.38045: variable '__network_connections_result' from source: set_fact 37031 1727204407.38111: handler run complete 37031 1727204407.38133: attempt loop complete, returning result 37031 1727204407.38143: _execute() done 37031 1727204407.38150: dumping result to json 37031 1727204407.38163: 
done dumping result, returning 37031 1727204407.38178: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-b754-dfb8-000000000080] 37031 1727204407.38193: sending task result for task 0affcd87-79f5-b754-dfb8-000000000080 ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 37031 1727204407.38368: no more pending results, returning what we have 37031 1727204407.38380: results queue empty 37031 1727204407.38381: checking for any_errors_fatal 37031 1727204407.38388: done checking for any_errors_fatal 37031 1727204407.38388: checking for max_fail_percentage 37031 1727204407.38390: done checking for max_fail_percentage 37031 1727204407.38391: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.38392: done checking to see if all hosts have failed 37031 1727204407.38393: getting the remaining hosts for this loop 37031 1727204407.38395: done getting the remaining hosts for this loop 37031 1727204407.38399: getting the next task for host managed-node2 37031 1727204407.38410: done getting next task for host managed-node2 37031 1727204407.38415: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 37031 1727204407.38419: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204407.38432: getting variables 37031 1727204407.38434: in VariableManager get_vars() 37031 1727204407.38494: Calling all_inventory to load vars for managed-node2 37031 1727204407.38497: Calling groups_inventory to load vars for managed-node2 37031 1727204407.38500: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.38631: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000080 37031 1727204407.38634: WORKER PROCESS EXITING 37031 1727204407.38644: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.38647: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.38650: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.40194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204407.41203: done with get_vars() 37031 1727204407.41226: done getting variables 37031 1727204407.41278: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:00:07 -0400 (0:00:00.049) 0:00:29.957 ***** 37031 1727204407.41304: entering _queue_task() for managed-node2/debug 37031 1727204407.41551: worker is 1 (out of 1 available) 37031 1727204407.41566: exiting _queue_task() for managed-node2/debug 37031 1727204407.41580: done queuing things up, now waiting for results queue to drain 37031 1727204407.41582: waiting for pending results... 
37031 1727204407.41784: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 37031 1727204407.41883: in run() - task 0affcd87-79f5-b754-dfb8-000000000081 37031 1727204407.41897: variable 'ansible_search_path' from source: unknown 37031 1727204407.41901: variable 'ansible_search_path' from source: unknown 37031 1727204407.41932: calling self._execute() 37031 1727204407.42004: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.42055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.42059: variable 'omit' from source: magic vars 37031 1727204407.42470: variable 'ansible_distribution_major_version' from source: facts 37031 1727204407.42497: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204407.42509: variable 'omit' from source: magic vars 37031 1727204407.42571: variable 'omit' from source: magic vars 37031 1727204407.42622: variable 'omit' from source: magic vars 37031 1727204407.42667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204407.42713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204407.42737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204407.42760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.42778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.42813: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204407.42830: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.42840: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 37031 1727204407.43052: Set connection var ansible_connection to ssh 37031 1727204407.43059: Set connection var ansible_shell_type to sh 37031 1727204407.43073: Set connection var ansible_pipelining to False 37031 1727204407.43087: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204407.43096: Set connection var ansible_timeout to 10 37031 1727204407.43106: Set connection var ansible_shell_executable to /bin/sh 37031 1727204407.43180: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.43207: variable 'ansible_connection' from source: unknown 37031 1727204407.43210: variable 'ansible_module_compression' from source: unknown 37031 1727204407.43212: variable 'ansible_shell_type' from source: unknown 37031 1727204407.43215: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.43217: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.43219: variable 'ansible_pipelining' from source: unknown 37031 1727204407.43220: variable 'ansible_timeout' from source: unknown 37031 1727204407.43222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.43324: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204407.43333: variable 'omit' from source: magic vars 37031 1727204407.43347: starting attempt loop 37031 1727204407.43350: running the handler 37031 1727204407.43392: variable '__network_connections_result' from source: set_fact 37031 1727204407.43451: variable '__network_connections_result' from source: set_fact 37031 1727204407.43531: handler run complete 37031 1727204407.43547: attempt loop complete, returning result 37031 1727204407.43550: 
_execute() done 37031 1727204407.43554: dumping result to json 37031 1727204407.43562: done dumping result, returning 37031 1727204407.43570: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-b754-dfb8-000000000081] 37031 1727204407.43575: sending task result for task 0affcd87-79f5-b754-dfb8-000000000081 37031 1727204407.43668: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000081 37031 1727204407.43671: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 37031 1727204407.43761: no more pending results, returning what we have 37031 1727204407.43766: results queue empty 37031 1727204407.43767: checking for any_errors_fatal 37031 1727204407.43774: done checking for any_errors_fatal 37031 1727204407.43775: checking for max_fail_percentage 37031 1727204407.43776: done checking for max_fail_percentage 37031 1727204407.43777: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.43778: done checking to see if all hosts have failed 37031 1727204407.43779: getting the remaining hosts for this loop 37031 1727204407.43781: done getting the remaining hosts for this loop 37031 1727204407.43785: getting the next task for host managed-node2 37031 1727204407.43791: done getting next task for host managed-node2 37031 1727204407.43795: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 37031 1727204407.43799: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204407.43809: getting variables 37031 1727204407.43810: in VariableManager get_vars() 37031 1727204407.43845: Calling all_inventory to load vars for managed-node2 37031 1727204407.43848: Calling groups_inventory to load vars for managed-node2 37031 1727204407.43850: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.43858: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.43860: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.43865: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.44789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204407.45692: done with get_vars() 37031 1727204407.45709: done getting variables 37031 1727204407.45757: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:00:07 -0400 (0:00:00.044) 0:00:30.002 ***** 37031 1727204407.45789: entering _queue_task() for managed-node2/debug 37031 1727204407.46023: 
worker is 1 (out of 1 available) 37031 1727204407.46038: exiting _queue_task() for managed-node2/debug 37031 1727204407.46051: done queuing things up, now waiting for results queue to drain 37031 1727204407.46052: waiting for pending results... 37031 1727204407.46266: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 37031 1727204407.46355: in run() - task 0affcd87-79f5-b754-dfb8-000000000082 37031 1727204407.46369: variable 'ansible_search_path' from source: unknown 37031 1727204407.46373: variable 'ansible_search_path' from source: unknown 37031 1727204407.46409: calling self._execute() 37031 1727204407.46479: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.46483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.46492: variable 'omit' from source: magic vars 37031 1727204407.46780: variable 'ansible_distribution_major_version' from source: facts 37031 1727204407.46790: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204407.46879: variable 'network_state' from source: role '' defaults 37031 1727204407.46888: Evaluated conditional (network_state != {}): False 37031 1727204407.46891: when evaluation is False, skipping this task 37031 1727204407.46894: _execute() done 37031 1727204407.46896: dumping result to json 37031 1727204407.46899: done dumping result, returning 37031 1727204407.46906: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-b754-dfb8-000000000082] 37031 1727204407.46911: sending task result for task 0affcd87-79f5-b754-dfb8-000000000082 37031 1727204407.46997: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000082 37031 1727204407.47000: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 37031 1727204407.47044: 
no more pending results, returning what we have 37031 1727204407.47048: results queue empty 37031 1727204407.47049: checking for any_errors_fatal 37031 1727204407.47057: done checking for any_errors_fatal 37031 1727204407.47058: checking for max_fail_percentage 37031 1727204407.47060: done checking for max_fail_percentage 37031 1727204407.47060: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.47061: done checking to see if all hosts have failed 37031 1727204407.47062: getting the remaining hosts for this loop 37031 1727204407.47065: done getting the remaining hosts for this loop 37031 1727204407.47069: getting the next task for host managed-node2 37031 1727204407.47076: done getting next task for host managed-node2 37031 1727204407.47081: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 37031 1727204407.47084: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204407.47210: getting variables 37031 1727204407.47213: in VariableManager get_vars() 37031 1727204407.47252: Calling all_inventory to load vars for managed-node2 37031 1727204407.47254: Calling groups_inventory to load vars for managed-node2 37031 1727204407.47259: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.47270: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.47273: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.47276: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.48555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204407.49469: done with get_vars() 37031 1727204407.49486: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:00:07 -0400 (0:00:00.037) 0:00:30.040 ***** 37031 1727204407.49555: entering _queue_task() for managed-node2/ping 37031 1727204407.49786: worker is 1 (out of 1 available) 37031 1727204407.49799: exiting _queue_task() for managed-node2/ping 37031 1727204407.49812: done queuing things up, now waiting for results queue to drain 37031 1727204407.49813: waiting for pending results... 
37031 1727204407.50001: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 37031 1727204407.50104: in run() - task 0affcd87-79f5-b754-dfb8-000000000083 37031 1727204407.50116: variable 'ansible_search_path' from source: unknown 37031 1727204407.50121: variable 'ansible_search_path' from source: unknown 37031 1727204407.50151: calling self._execute() 37031 1727204407.50253: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.50271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.50283: variable 'omit' from source: magic vars 37031 1727204407.50669: variable 'ansible_distribution_major_version' from source: facts 37031 1727204407.50694: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204407.50704: variable 'omit' from source: magic vars 37031 1727204407.50770: variable 'omit' from source: magic vars 37031 1727204407.50819: variable 'omit' from source: magic vars 37031 1727204407.51168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204407.51172: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204407.51175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204407.51178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.51180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204407.51183: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204407.51185: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.51187: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 37031 1727204407.51189: Set connection var ansible_connection to ssh 37031 1727204407.51191: Set connection var ansible_shell_type to sh 37031 1727204407.51193: Set connection var ansible_pipelining to False 37031 1727204407.51195: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204407.51197: Set connection var ansible_timeout to 10 37031 1727204407.51199: Set connection var ansible_shell_executable to /bin/sh 37031 1727204407.51201: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.51202: variable 'ansible_connection' from source: unknown 37031 1727204407.51204: variable 'ansible_module_compression' from source: unknown 37031 1727204407.51206: variable 'ansible_shell_type' from source: unknown 37031 1727204407.51208: variable 'ansible_shell_executable' from source: unknown 37031 1727204407.51210: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204407.51212: variable 'ansible_pipelining' from source: unknown 37031 1727204407.51214: variable 'ansible_timeout' from source: unknown 37031 1727204407.51216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204407.51361: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 37031 1727204407.51370: variable 'omit' from source: magic vars 37031 1727204407.51376: starting attempt loop 37031 1727204407.51379: running the handler 37031 1727204407.51393: _low_level_execute_command(): starting 37031 1727204407.51400: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204407.52122: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.52133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 
1727204407.52144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.52160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.52205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.52210: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.52220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.52234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.52242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.52249: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.52259: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.52267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.52281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.52288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.52295: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.52305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.52379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.52398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.52411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.52485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 
1727204407.54135: stdout chunk (state=3): >>>/root <<< 37031 1727204407.54319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.54325: stdout chunk (state=3): >>><<< 37031 1727204407.54335: stderr chunk (state=3): >>><<< 37031 1727204407.54364: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204407.54382: _low_level_execute_command(): starting 37031 1727204407.54390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720 `" && echo ansible-tmp-1727204407.5436583-39006-33433827742720="` echo /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720 `" ) && sleep 0' 37031 1727204407.55070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 37031 1727204407.55080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.55090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.55104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.55146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.55153: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.55162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.55181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.55188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.55194: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.55202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.55211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.55222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.55228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.55236: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.55245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.55321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.55336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.55340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 37031 1727204407.55423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.57349: stdout chunk (state=3): >>>ansible-tmp-1727204407.5436583-39006-33433827742720=/root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720 <<< 37031 1727204407.57484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.57488: stderr chunk (state=3): >>><<< 37031 1727204407.57491: stdout chunk (state=3): >>><<< 37031 1727204407.57515: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204407.5436583-39006-33433827742720=/root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204407.57568: variable 'ansible_module_compression' from source: unknown 37031 1727204407.57616: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 37031 1727204407.57651: variable 'ansible_facts' from source: unknown 37031 1727204407.57715: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/AnsiballZ_ping.py 37031 1727204407.58374: Sending initial data 37031 1727204407.58378: Sent initial data (152 bytes) 37031 1727204407.59966: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.59988: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.59998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.60012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.60052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.60061: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.60071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.60090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.60098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.60106: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.60115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.60127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.60138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.60146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 
1727204407.60153: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.60162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.60248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.60267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.60270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.60340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.62132: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204407.62162: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204407.62203: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpbuu_gtm5 /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/AnsiballZ_ping.py <<< 37031 1727204407.62250: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204407.63887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.63970: stderr chunk (state=3): >>><<< 37031 1727204407.63974: stdout chunk (state=3): >>><<< 37031 1727204407.64073: done transferring module 
to remote 37031 1727204407.64076: _low_level_execute_command(): starting 37031 1727204407.64081: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/ /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/AnsiballZ_ping.py && sleep 0' 37031 1727204407.65715: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.65722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.65733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.65747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.65789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.65796: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.65807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.65820: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.65828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.65834: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.65842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.65850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.65862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.65875: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.65882: stderr chunk (state=3): >>>debug2: match found <<< 37031 
1727204407.65892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.65962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.66684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.66690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.66761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.68660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.68666: stdout chunk (state=3): >>><<< 37031 1727204407.68676: stderr chunk (state=3): >>><<< 37031 1727204407.68694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204407.68697: 
_low_level_execute_command(): starting 37031 1727204407.68702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/AnsiballZ_ping.py && sleep 0' 37031 1727204407.70072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.70682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.70692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.70705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.70745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.70751: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.70763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.70779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204407.70786: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.70792: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.70799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.70809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.70819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.70827: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.70835: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.70843: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.70922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.70937: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.70943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.71293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.84329: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 37031 1727204407.85388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204407.85445: stderr chunk (state=3): >>><<< 37031 1727204407.85448: stdout chunk (state=3): >>><<< 37031 1727204407.85583: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204407.85587: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204407.85594: _low_level_execute_command(): starting 37031 1727204407.85597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204407.5436583-39006-33433827742720/ > /dev/null 2>&1 && sleep 0' 37031 1727204407.86292: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204407.86305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.86319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.86338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.86393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.86404: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204407.86417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.86433: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 37031 1727204407.86446: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204407.86468: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204407.86484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204407.86498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204407.86513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204407.86524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204407.86534: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204407.86545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204407.86632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204407.86653: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204407.86677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204407.86781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204407.88630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204407.88685: stderr chunk (state=3): >>><<< 37031 1727204407.88689: stdout chunk (state=3): >>><<< 37031 1727204407.88882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204407.88885: handler run complete 37031 1727204407.88888: attempt loop complete, returning result 37031 1727204407.88890: _execute() done 37031 1727204407.88891: dumping result to json 37031 1727204407.88893: done dumping result, returning 37031 1727204407.88895: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-b754-dfb8-000000000083] 37031 1727204407.88897: sending task result for task 0affcd87-79f5-b754-dfb8-000000000083 37031 1727204407.88977: done sending task result for task 0affcd87-79f5-b754-dfb8-000000000083 37031 1727204407.88980: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 37031 1727204407.89052: no more pending results, returning what we have 37031 1727204407.89056: results queue empty 37031 1727204407.89057: checking for any_errors_fatal 37031 1727204407.89063: done checking for any_errors_fatal 37031 1727204407.89066: checking for max_fail_percentage 37031 1727204407.89068: done checking for max_fail_percentage 37031 1727204407.89069: checking to see if all hosts have failed and the running result is not ok 37031 1727204407.89070: done checking to see if all hosts have 
failed 37031 1727204407.89071: getting the remaining hosts for this loop 37031 1727204407.89073: done getting the remaining hosts for this loop 37031 1727204407.89077: getting the next task for host managed-node2 37031 1727204407.89088: done getting next task for host managed-node2 37031 1727204407.89093: ^ task is: TASK: meta (role_complete) 37031 1727204407.89097: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204407.89110: getting variables 37031 1727204407.89112: in VariableManager get_vars() 37031 1727204407.89161: Calling all_inventory to load vars for managed-node2 37031 1727204407.89167: Calling groups_inventory to load vars for managed-node2 37031 1727204407.89170: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204407.89182: Calling all_plugins_play to load vars for managed-node2 37031 1727204407.89184: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204407.89187: Calling groups_plugins_play to load vars for managed-node2 37031 1727204407.98645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.00301: done with get_vars() 37031 1727204408.00330: done getting variables 37031 1727204408.00412: done queuing things up, now waiting for results queue to drain 37031 1727204408.00415: results queue empty 37031 1727204408.00416: checking for any_errors_fatal 37031 1727204408.00419: done checking for 
any_errors_fatal 37031 1727204408.00420: checking for max_fail_percentage 37031 1727204408.00422: done checking for max_fail_percentage 37031 1727204408.00422: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.00423: done checking to see if all hosts have failed 37031 1727204408.00424: getting the remaining hosts for this loop 37031 1727204408.00425: done getting the remaining hosts for this loop 37031 1727204408.00428: getting the next task for host managed-node2 37031 1727204408.00432: done getting next task for host managed-node2 37031 1727204408.00434: ^ task is: TASK: Include the task 'manage_test_interface.yml' 37031 1727204408.00436: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204408.00439: getting variables 37031 1727204408.00445: in VariableManager get_vars() 37031 1727204408.00467: Calling all_inventory to load vars for managed-node2 37031 1727204408.00470: Calling groups_inventory to load vars for managed-node2 37031 1727204408.00472: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.00477: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.00479: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.00482: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.01669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.03297: done with get_vars() 37031 1727204408.03316: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.538) 0:00:30.578 ***** 37031 1727204408.03372: entering _queue_task() for managed-node2/include_tasks 37031 1727204408.03618: worker is 1 (out of 1 available) 37031 1727204408.03631: exiting _queue_task() for managed-node2/include_tasks 37031 1727204408.03645: done queuing things up, now waiting for results queue to drain 37031 1727204408.03647: waiting for pending results... 37031 1727204408.03840: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 37031 1727204408.03917: in run() - task 0affcd87-79f5-b754-dfb8-0000000000b3 37031 1727204408.03928: variable 'ansible_search_path' from source: unknown 37031 1727204408.03966: calling self._execute() 37031 1727204408.04034: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.04040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.04051: variable 'omit' from source: magic vars 37031 1727204408.04342: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.04352: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.04360: _execute() done 37031 1727204408.04365: dumping result to json 37031 1727204408.04368: done dumping result, returning 37031 1727204408.04374: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [0affcd87-79f5-b754-dfb8-0000000000b3] 37031 1727204408.04376: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b3 37031 1727204408.04477: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b3 37031 1727204408.04480: WORKER PROCESS EXITING 37031 1727204408.04509: no more pending results, returning what we have 37031 1727204408.04514: in VariableManager get_vars() 37031 1727204408.04566: Calling 
all_inventory to load vars for managed-node2 37031 1727204408.04569: Calling groups_inventory to load vars for managed-node2 37031 1727204408.04571: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.04583: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.04586: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.04594: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.05951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.07010: done with get_vars() 37031 1727204408.07025: variable 'ansible_search_path' from source: unknown 37031 1727204408.07038: we have included files to process 37031 1727204408.07039: generating all_blocks data 37031 1727204408.07040: done generating all_blocks data 37031 1727204408.07045: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 37031 1727204408.07046: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 37031 1727204408.07047: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 37031 1727204408.07356: in VariableManager get_vars() 37031 1727204408.07378: done with get_vars() 37031 1727204408.07808: done processing included file 37031 1727204408.07810: iterating over new_blocks loaded from include file 37031 1727204408.07811: in VariableManager get_vars() 37031 1727204408.07823: done with get_vars() 37031 1727204408.07824: filtering new block on tags 37031 1727204408.07845: done filtering new block on tags 37031 1727204408.07847: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 37031 1727204408.07851: extending task lists for all hosts with included blocks 37031 1727204408.10606: done extending task lists 37031 1727204408.10608: done processing included files 37031 1727204408.10609: results queue empty 37031 1727204408.10609: checking for any_errors_fatal 37031 1727204408.10611: done checking for any_errors_fatal 37031 1727204408.10612: checking for max_fail_percentage 37031 1727204408.10613: done checking for max_fail_percentage 37031 1727204408.10614: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.10615: done checking to see if all hosts have failed 37031 1727204408.10616: getting the remaining hosts for this loop 37031 1727204408.10617: done getting the remaining hosts for this loop 37031 1727204408.10620: getting the next task for host managed-node2 37031 1727204408.10624: done getting next task for host managed-node2 37031 1727204408.10626: ^ task is: TASK: Ensure state in ["present", "absent"] 37031 1727204408.10629: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.10632: getting variables 37031 1727204408.10633: in VariableManager get_vars() 37031 1727204408.10653: Calling all_inventory to load vars for managed-node2 37031 1727204408.10656: Calling groups_inventory to load vars for managed-node2 37031 1727204408.10661: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.10669: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.10672: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.10675: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.11579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.13177: done with get_vars() 37031 1727204408.13207: done getting variables 37031 1727204408.13259: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.099) 0:00:30.677 ***** 37031 1727204408.13291: entering _queue_task() for managed-node2/fail 37031 1727204408.13644: worker is 1 (out of 1 available) 37031 1727204408.13661: exiting _queue_task() for managed-node2/fail 37031 1727204408.13677: done queuing things up, now waiting for results queue to drain 37031 1727204408.13679: waiting for pending results... 
37031 1727204408.13986: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 37031 1727204408.14105: in run() - task 0affcd87-79f5-b754-dfb8-0000000005cc 37031 1727204408.14129: variable 'ansible_search_path' from source: unknown 37031 1727204408.14136: variable 'ansible_search_path' from source: unknown 37031 1727204408.14181: calling self._execute() 37031 1727204408.14283: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.14295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.14309: variable 'omit' from source: magic vars 37031 1727204408.14741: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.14766: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.14915: variable 'state' from source: include params 37031 1727204408.14926: Evaluated conditional (state not in ["present", "absent"]): False 37031 1727204408.14933: when evaluation is False, skipping this task 37031 1727204408.14940: _execute() done 37031 1727204408.14948: dumping result to json 37031 1727204408.14959: done dumping result, returning 37031 1727204408.14973: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [0affcd87-79f5-b754-dfb8-0000000005cc] 37031 1727204408.14984: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cc skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 37031 1727204408.15141: no more pending results, returning what we have 37031 1727204408.15146: results queue empty 37031 1727204408.15147: checking for any_errors_fatal 37031 1727204408.15149: done checking for any_errors_fatal 37031 1727204408.15150: checking for max_fail_percentage 37031 1727204408.15152: done checking for max_fail_percentage 37031 1727204408.15153: checking to see if all hosts 
have failed and the running result is not ok 37031 1727204408.15154: done checking to see if all hosts have failed 37031 1727204408.15155: getting the remaining hosts for this loop 37031 1727204408.15160: done getting the remaining hosts for this loop 37031 1727204408.15166: getting the next task for host managed-node2 37031 1727204408.15174: done getting next task for host managed-node2 37031 1727204408.15177: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 37031 1727204408.15181: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.15185: getting variables 37031 1727204408.15187: in VariableManager get_vars() 37031 1727204408.15234: Calling all_inventory to load vars for managed-node2 37031 1727204408.15238: Calling groups_inventory to load vars for managed-node2 37031 1727204408.15240: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.15254: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.15260: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.15263: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.16283: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cc 37031 1727204408.16287: WORKER PROCESS EXITING 37031 1727204408.17030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.18749: done with get_vars() 37031 1727204408.19330: done getting variables 37031 1727204408.19403: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.061) 0:00:30.739 ***** 37031 1727204408.19437: entering _queue_task() for managed-node2/fail 37031 1727204408.19870: worker is 1 (out of 1 available) 37031 1727204408.19884: exiting _queue_task() for managed-node2/fail 37031 1727204408.19898: done queuing things up, now waiting for results queue to drain 37031 1727204408.19900: waiting for pending results... 
37031 1727204408.20611: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 37031 1727204408.20729: in run() - task 0affcd87-79f5-b754-dfb8-0000000005cd 37031 1727204408.20753: variable 'ansible_search_path' from source: unknown 37031 1727204408.20761: variable 'ansible_search_path' from source: unknown 37031 1727204408.20804: calling self._execute() 37031 1727204408.20909: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.20920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.20933: variable 'omit' from source: magic vars 37031 1727204408.21328: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.21346: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.21526: variable 'type' from source: play vars 37031 1727204408.21537: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 37031 1727204408.21551: when evaluation is False, skipping this task 37031 1727204408.21562: _execute() done 37031 1727204408.21573: dumping result to json 37031 1727204408.21582: done dumping result, returning 37031 1727204408.21592: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [0affcd87-79f5-b754-dfb8-0000000005cd] 37031 1727204408.21612: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cd skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 37031 1727204408.21798: no more pending results, returning what we have 37031 1727204408.21802: results queue empty 37031 1727204408.21804: checking for any_errors_fatal 37031 1727204408.21809: done checking for any_errors_fatal 37031 1727204408.21810: checking for max_fail_percentage 37031 1727204408.21812: done checking for max_fail_percentage 37031 1727204408.21813: checking to see if all 
hosts have failed and the running result is not ok 37031 1727204408.21815: done checking to see if all hosts have failed 37031 1727204408.21815: getting the remaining hosts for this loop 37031 1727204408.21817: done getting the remaining hosts for this loop 37031 1727204408.21821: getting the next task for host managed-node2 37031 1727204408.21829: done getting next task for host managed-node2 37031 1727204408.21833: ^ task is: TASK: Include the task 'show_interfaces.yml' 37031 1727204408.21838: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.21843: getting variables 37031 1727204408.21845: in VariableManager get_vars() 37031 1727204408.21895: Calling all_inventory to load vars for managed-node2 37031 1727204408.21898: Calling groups_inventory to load vars for managed-node2 37031 1727204408.21901: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.21914: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.21918: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.21921: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.23109: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cd 37031 1727204408.23112: WORKER PROCESS EXITING 37031 1727204408.23745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.24653: done with get_vars() 37031 1727204408.24674: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.053) 0:00:30.792 ***** 37031 1727204408.24748: entering _queue_task() for managed-node2/include_tasks 37031 1727204408.25315: worker is 1 (out of 1 available) 37031 1727204408.25323: exiting _queue_task() for managed-node2/include_tasks 37031 1727204408.25335: done queuing things up, now waiting for results queue to drain 37031 1727204408.25337: waiting for pending results... 
37031 1727204408.25359: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 37031 1727204408.25439: in run() - task 0affcd87-79f5-b754-dfb8-0000000005ce 37031 1727204408.25461: variable 'ansible_search_path' from source: unknown 37031 1727204408.25473: variable 'ansible_search_path' from source: unknown 37031 1727204408.25517: calling self._execute() 37031 1727204408.25621: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.25633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.25648: variable 'omit' from source: magic vars 37031 1727204408.26020: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.26060: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.26074: _execute() done 37031 1727204408.26090: dumping result to json 37031 1727204408.26098: done dumping result, returning 37031 1727204408.26109: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-b754-dfb8-0000000005ce] 37031 1727204408.26130: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005ce 37031 1727204408.26294: no more pending results, returning what we have 37031 1727204408.26300: in VariableManager get_vars() 37031 1727204408.26351: Calling all_inventory to load vars for managed-node2 37031 1727204408.26354: Calling groups_inventory to load vars for managed-node2 37031 1727204408.26356: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.26375: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.26378: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.26381: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.26900: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005ce 37031 1727204408.26904: WORKER PROCESS EXITING 37031 1727204408.27374: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.28290: done with get_vars() 37031 1727204408.28306: variable 'ansible_search_path' from source: unknown 37031 1727204408.28307: variable 'ansible_search_path' from source: unknown 37031 1727204408.28336: we have included files to process 37031 1727204408.28337: generating all_blocks data 37031 1727204408.28338: done generating all_blocks data 37031 1727204408.28342: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204408.28342: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204408.28344: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 37031 1727204408.28423: in VariableManager get_vars() 37031 1727204408.28443: done with get_vars() 37031 1727204408.28529: done processing included file 37031 1727204408.28531: iterating over new_blocks loaded from include file 37031 1727204408.28532: in VariableManager get_vars() 37031 1727204408.28552: done with get_vars() 37031 1727204408.28554: filtering new block on tags 37031 1727204408.28583: done filtering new block on tags 37031 1727204408.28585: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 37031 1727204408.28589: extending task lists for all hosts with included blocks 37031 1727204408.29075: done extending task lists 37031 1727204408.29077: done processing included files 37031 1727204408.29078: results queue empty 37031 1727204408.29079: checking for any_errors_fatal 37031 1727204408.29082: done checking for any_errors_fatal 37031 1727204408.29083: checking for 
max_fail_percentage 37031 1727204408.29084: done checking for max_fail_percentage 37031 1727204408.29085: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.29086: done checking to see if all hosts have failed 37031 1727204408.29087: getting the remaining hosts for this loop 37031 1727204408.29088: done getting the remaining hosts for this loop 37031 1727204408.29091: getting the next task for host managed-node2 37031 1727204408.29095: done getting next task for host managed-node2 37031 1727204408.29097: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 37031 1727204408.29100: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.29103: getting variables 37031 1727204408.29104: in VariableManager get_vars() 37031 1727204408.29120: Calling all_inventory to load vars for managed-node2 37031 1727204408.29123: Calling groups_inventory to load vars for managed-node2 37031 1727204408.29125: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.29130: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.29133: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.29135: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.30604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.31513: done with get_vars() 37031 1727204408.31529: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.068) 0:00:30.860 ***** 37031 1727204408.31594: entering _queue_task() for managed-node2/include_tasks 37031 1727204408.31836: worker is 1 (out of 1 available) 37031 1727204408.31850: exiting _queue_task() for managed-node2/include_tasks 37031 1727204408.31867: done queuing things up, now waiting for results queue to drain 37031 1727204408.31869: waiting for pending results... 
37031 1727204408.32050: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 37031 1727204408.32133: in run() - task 0affcd87-79f5-b754-dfb8-0000000006e4 37031 1727204408.32142: variable 'ansible_search_path' from source: unknown 37031 1727204408.32145: variable 'ansible_search_path' from source: unknown 37031 1727204408.32178: calling self._execute() 37031 1727204408.32294: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.32307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.32317: variable 'omit' from source: magic vars 37031 1727204408.32746: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.32768: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.32778: _execute() done 37031 1727204408.32785: dumping result to json 37031 1727204408.32793: done dumping result, returning 37031 1727204408.32807: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-b754-dfb8-0000000006e4] 37031 1727204408.32815: sending task result for task 0affcd87-79f5-b754-dfb8-0000000006e4 37031 1727204408.32931: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000006e4 37031 1727204408.32942: WORKER PROCESS EXITING 37031 1727204408.33190: no more pending results, returning what we have 37031 1727204408.33194: in VariableManager get_vars() 37031 1727204408.33233: Calling all_inventory to load vars for managed-node2 37031 1727204408.33237: Calling groups_inventory to load vars for managed-node2 37031 1727204408.33239: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.33249: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.33252: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.33255: Calling groups_plugins_play to load vars for managed-node2 37031 
1727204408.34904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.36447: done with get_vars() 37031 1727204408.36467: variable 'ansible_search_path' from source: unknown 37031 1727204408.36469: variable 'ansible_search_path' from source: unknown 37031 1727204408.36510: we have included files to process 37031 1727204408.36511: generating all_blocks data 37031 1727204408.36512: done generating all_blocks data 37031 1727204408.36513: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204408.36513: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204408.36515: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 37031 1727204408.36707: done processing included file 37031 1727204408.36709: iterating over new_blocks loaded from include file 37031 1727204408.36710: in VariableManager get_vars() 37031 1727204408.36726: done with get_vars() 37031 1727204408.36727: filtering new block on tags 37031 1727204408.36739: done filtering new block on tags 37031 1727204408.36740: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 37031 1727204408.36744: extending task lists for all hosts with included blocks 37031 1727204408.36887: done extending task lists 37031 1727204408.36888: done processing included files 37031 1727204408.36889: results queue empty 37031 1727204408.36890: checking for any_errors_fatal 37031 1727204408.36893: done checking for any_errors_fatal 37031 1727204408.36894: checking for max_fail_percentage 37031 1727204408.36895: done 
checking for max_fail_percentage 37031 1727204408.36895: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.36896: done checking to see if all hosts have failed 37031 1727204408.36897: getting the remaining hosts for this loop 37031 1727204408.36898: done getting the remaining hosts for this loop 37031 1727204408.36907: getting the next task for host managed-node2 37031 1727204408.36911: done getting next task for host managed-node2 37031 1727204408.36914: ^ task is: TASK: Gather current interface info 37031 1727204408.36917: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.36920: getting variables 37031 1727204408.36921: in VariableManager get_vars() 37031 1727204408.36936: Calling all_inventory to load vars for managed-node2 37031 1727204408.36938: Calling groups_inventory to load vars for managed-node2 37031 1727204408.36940: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.36945: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.36948: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.36951: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.38539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.40117: done with get_vars() 37031 1727204408.40143: done getting variables 37031 1727204408.40194: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.086) 0:00:30.947 ***** 37031 1727204408.40230: entering _queue_task() for managed-node2/command 37031 1727204408.40568: worker is 1 (out of 1 available) 37031 1727204408.40581: exiting _queue_task() for managed-node2/command 37031 1727204408.40595: done queuing things up, now waiting for results queue to drain 37031 1727204408.40597: waiting for pending results... 
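The task banner above (`get_current_interfaces.yml:3`) runs the `command` action; per the `module_args` dumped later in this log, it executes `ls -1` with `chdir=/sys/class/net`. A minimal Python sketch of what that step collects on the managed node, using a temporary directory to stand in for sysfs so the sketch is self-contained (the helper name `gather_interface_names` is illustrative, not from the role):

```python
import os
import tempfile

def gather_interface_names(sysfs_net):
    """Return the entries of a sysfs net directory, sorted like `ls -1`."""
    return sorted(os.listdir(sysfs_net))

# Emulate /sys/class/net with the entries seen in this run's stdout;
# note 'bonding_masters' is a kernel control file, not an interface.
with tempfile.TemporaryDirectory() as fake_sysfs:
    for name in ("bonding_masters", "eth0", "lo", "veth0"):
        open(os.path.join(fake_sysfs, name), "w").close()
    print(gather_interface_names(fake_sysfs))
    # → ['bonding_masters', 'eth0', 'lo', 'veth0']
```

On a real host you would pass `"/sys/class/net"` instead of the temporary directory.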
37031 1727204408.40895: running TaskExecutor() for managed-node2/TASK: Gather current interface info 37031 1727204408.41004: in run() - task 0affcd87-79f5-b754-dfb8-00000000071b 37031 1727204408.41017: variable 'ansible_search_path' from source: unknown 37031 1727204408.41021: variable 'ansible_search_path' from source: unknown 37031 1727204408.41066: calling self._execute() 37031 1727204408.41162: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.41168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.41177: variable 'omit' from source: magic vars 37031 1727204408.41556: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.41571: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.41578: variable 'omit' from source: magic vars 37031 1727204408.41637: variable 'omit' from source: magic vars 37031 1727204408.41674: variable 'omit' from source: magic vars 37031 1727204408.41718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204408.41754: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204408.41777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204408.41795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.41814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.41847: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204408.41851: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.41854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 
1727204408.41953: Set connection var ansible_connection to ssh 37031 1727204408.41958: Set connection var ansible_shell_type to sh 37031 1727204408.41962: Set connection var ansible_pipelining to False 37031 1727204408.41972: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204408.41980: Set connection var ansible_timeout to 10 37031 1727204408.41983: Set connection var ansible_shell_executable to /bin/sh 37031 1727204408.42011: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.42014: variable 'ansible_connection' from source: unknown 37031 1727204408.42018: variable 'ansible_module_compression' from source: unknown 37031 1727204408.42026: variable 'ansible_shell_type' from source: unknown 37031 1727204408.42029: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.42032: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.42037: variable 'ansible_pipelining' from source: unknown 37031 1727204408.42039: variable 'ansible_timeout' from source: unknown 37031 1727204408.42043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.42191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204408.42200: variable 'omit' from source: magic vars 37031 1727204408.42205: starting attempt loop 37031 1727204408.42208: running the handler 37031 1727204408.42224: _low_level_execute_command(): starting 37031 1727204408.42232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204408.42972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204408.42985: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 37031 1727204408.42997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.43015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.43055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.43061: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204408.43071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.43085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204408.43093: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204408.43099: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204408.43107: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.43123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.43135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.43143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.43151: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204408.43161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.43232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204408.43251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.43262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204408.43337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 37031 1727204408.44992: stdout chunk (state=3): >>>/root <<< 37031 1727204408.45094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204408.45185: stderr chunk (state=3): >>><<< 37031 1727204408.45199: stdout chunk (state=3): >>><<< 37031 1727204408.45328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204408.45332: _low_level_execute_command(): starting 37031 1727204408.45335: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025 `" && echo ansible-tmp-1727204408.4523318-39056-252375489873025="` echo /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025 `" ) && sleep 0' 37031 1727204408.45914: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204408.45929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.45944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.45963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.46008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.46021: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204408.46038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.46056: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204408.46075: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204408.46089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204408.46102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.46115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.46131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.46143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.46155: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204408.46172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.46247: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204408.46274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.46293: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 37031 1727204408.46366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204408.48216: stdout chunk (state=3): >>>ansible-tmp-1727204408.4523318-39056-252375489873025=/root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025 <<< 37031 1727204408.48332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204408.48430: stderr chunk (state=3): >>><<< 37031 1727204408.48445: stdout chunk (state=3): >>><<< 37031 1727204408.48588: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204408.4523318-39056-252375489873025=/root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204408.48591: variable 'ansible_module_compression' from source: unknown 37031 1727204408.48684: 
ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204408.48687: variable 'ansible_facts' from source: unknown 37031 1727204408.48751: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/AnsiballZ_command.py 37031 1727204408.48954: Sending initial data 37031 1727204408.48957: Sent initial data (156 bytes) 37031 1727204408.50451: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.50454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.50489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.50492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.50495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.50555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.50572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204408.50645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 
1727204408.52353: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204408.52385: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204408.52420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmphc7nhtl0 /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/AnsiballZ_command.py <<< 37031 1727204408.52455: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204408.53470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204408.53549: stderr chunk (state=3): >>><<< 37031 1727204408.53552: stdout chunk (state=3): >>><<< 37031 1727204408.53581: done transferring module to remote 37031 1727204408.53593: _low_level_execute_command(): starting 37031 1727204408.53598: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/ /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/AnsiballZ_command.py && sleep 0' 37031 1727204408.54374: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204408.54384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.54395: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 37031 1727204408.54415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.54489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.54496: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204408.54506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.54889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204408.54892: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204408.54895: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204408.54898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.54900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.54903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.54905: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.54907: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204408.54909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.54912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204408.54914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.54916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204408.54918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204408.56588: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 37031 1727204408.56682: stderr chunk (state=3): >>><<< 37031 1727204408.56690: stdout chunk (state=3): >>><<< 37031 1727204408.56716: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204408.56720: _low_level_execute_command(): starting 37031 1727204408.56722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/AnsiballZ_command.py && sleep 0' 37031 1727204408.57420: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204408.57429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.57439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.57455: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.57504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.57511: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204408.57521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.57535: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204408.57542: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204408.57549: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204408.57558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.57575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.57594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.57601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.57608: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204408.57618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.57703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204408.57721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.57733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204408.57816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204408.71374: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:08.709505", 
"end": "2024-09-24 15:00:08.712748", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204408.72586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204408.72649: stderr chunk (state=3): >>><<< 37031 1727204408.72653: stdout chunk (state=3): >>><<< 37031 1727204408.72680: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:00:08.709505", "end": "2024-09-24 15:00:08.712748", "delta": "0:00:00.003243", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204408.72722: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204408.72729: _low_level_execute_command(): starting 37031 1727204408.72735: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204408.4523318-39056-252375489873025/ > /dev/null 2>&1 && sleep 0' 37031 1727204408.73448: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204408.73457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.73480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.73496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.73532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 
1727204408.73539: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204408.73548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.73563: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204408.73572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204408.73584: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204408.73593: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204408.73604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204408.73614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204408.73621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204408.73627: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204408.73636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204408.73720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204408.73737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204408.73748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204408.73823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204408.75622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204408.75698: stderr chunk (state=3): >>><<< 37031 1727204408.75701: stdout chunk (state=3): >>><<< 37031 1727204408.75719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204408.75726: handler run complete 37031 1727204408.75752: Evaluated conditional (False): False 37031 1727204408.75768: attempt loop complete, returning result 37031 1727204408.75771: _execute() done 37031 1727204408.75776: dumping result to json 37031 1727204408.75778: done dumping result, returning 37031 1727204408.75787: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [0affcd87-79f5-b754-dfb8-00000000071b] 37031 1727204408.75792: sending task result for task 0affcd87-79f5-b754-dfb8-00000000071b 37031 1727204408.75902: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000071b 37031 1727204408.75904: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003243", "end": "2024-09-24 15:00:08.712748", "rc": 0, "start": "2024-09-24 15:00:08.709505" } STDOUT: bonding_masters eth0 lo veth0 37031 1727204408.75995: no more pending results, returning 
what we have 37031 1727204408.75999: results queue empty 37031 1727204408.76000: checking for any_errors_fatal 37031 1727204408.76002: done checking for any_errors_fatal 37031 1727204408.76002: checking for max_fail_percentage 37031 1727204408.76004: done checking for max_fail_percentage 37031 1727204408.76005: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.76006: done checking to see if all hosts have failed 37031 1727204408.76006: getting the remaining hosts for this loop 37031 1727204408.76008: done getting the remaining hosts for this loop 37031 1727204408.76012: getting the next task for host managed-node2 37031 1727204408.76020: done getting next task for host managed-node2 37031 1727204408.76022: ^ task is: TASK: Set current_interfaces 37031 1727204408.76027: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.76032: getting variables 37031 1727204408.76034: in VariableManager get_vars() 37031 1727204408.76078: Calling all_inventory to load vars for managed-node2 37031 1727204408.76081: Calling groups_inventory to load vars for managed-node2 37031 1727204408.76082: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.76092: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.76094: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.76102: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.77628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.79372: done with get_vars() 37031 1727204408.79404: done getting variables 37031 1727204408.79478: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.392) 0:00:31.339 ***** 37031 1727204408.79514: entering _queue_task() for managed-node2/set_fact 37031 1727204408.79858: worker is 1 (out of 1 available) 37031 1727204408.79874: exiting _queue_task() for managed-node2/set_fact 37031 1727204408.79892: done queuing things up, now waiting for results queue to drain 37031 1727204408.79894: waiting for pending results... 
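The `Set current_interfaces` task queued here (task path `.../tests/network/playbooks/tasks/get_current_interfaces.yml:9`) consumes the `ls -1` output produced by the preceding `Gather current interface info` task. A hedged sketch of that task pair follows — it is not the verbatim test source; the `chdir` directory is an assumption (the log records only `ls -1` and its stdout), and the exact registration of `_current_interfaces` may differ (the log attributes it to `set_fact`):

```yaml
# Hedged sketch, not the verbatim role/test source. The log shows a command
# task whose stdout lists interface names (bonding_masters, eth0, lo, veth0),
# followed by a set_fact that stores them in the fact `current_interfaces`.
- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net        # assumed directory; the log shows only "ls -1"
  register: _current_interfaces  # assumption; the log says this var came "from source: set_fact"

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```

This is consistent with the fact recorded just below in the log: `current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0']`.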
37031 1727204408.80185: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 37031 1727204408.80317: in run() - task 0affcd87-79f5-b754-dfb8-00000000071c 37031 1727204408.80341: variable 'ansible_search_path' from source: unknown 37031 1727204408.80350: variable 'ansible_search_path' from source: unknown 37031 1727204408.80395: calling self._execute() 37031 1727204408.80504: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.80515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.80534: variable 'omit' from source: magic vars 37031 1727204408.80947: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.80972: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.80986: variable 'omit' from source: magic vars 37031 1727204408.81044: variable 'omit' from source: magic vars 37031 1727204408.81158: variable '_current_interfaces' from source: set_fact 37031 1727204408.81240: variable 'omit' from source: magic vars 37031 1727204408.81290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204408.81334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204408.81360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204408.81386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.81411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.81447: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204408.81457: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.81467: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.81580: Set connection var ansible_connection to ssh 37031 1727204408.81588: Set connection var ansible_shell_type to sh 37031 1727204408.81601: Set connection var ansible_pipelining to False 37031 1727204408.81619: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204408.81635: Set connection var ansible_timeout to 10 37031 1727204408.81646: Set connection var ansible_shell_executable to /bin/sh 37031 1727204408.81679: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.81688: variable 'ansible_connection' from source: unknown 37031 1727204408.81696: variable 'ansible_module_compression' from source: unknown 37031 1727204408.81703: variable 'ansible_shell_type' from source: unknown 37031 1727204408.81710: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.81716: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.81730: variable 'ansible_pipelining' from source: unknown 37031 1727204408.81739: variable 'ansible_timeout' from source: unknown 37031 1727204408.81749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.81900: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204408.81916: variable 'omit' from source: magic vars 37031 1727204408.81929: starting attempt loop 37031 1727204408.81937: running the handler 37031 1727204408.81958: handler run complete 37031 1727204408.81977: attempt loop complete, returning result 37031 1727204408.81984: _execute() done 37031 1727204408.81992: dumping result to json 37031 1727204408.81999: done dumping result, returning 37031 
1727204408.82010: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [0affcd87-79f5-b754-dfb8-00000000071c] 37031 1727204408.82017: sending task result for task 0affcd87-79f5-b754-dfb8-00000000071c ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 37031 1727204408.82182: no more pending results, returning what we have 37031 1727204408.82186: results queue empty 37031 1727204408.82187: checking for any_errors_fatal 37031 1727204408.82195: done checking for any_errors_fatal 37031 1727204408.82196: checking for max_fail_percentage 37031 1727204408.82198: done checking for max_fail_percentage 37031 1727204408.82199: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.82200: done checking to see if all hosts have failed 37031 1727204408.82201: getting the remaining hosts for this loop 37031 1727204408.82203: done getting the remaining hosts for this loop 37031 1727204408.82208: getting the next task for host managed-node2 37031 1727204408.82217: done getting next task for host managed-node2 37031 1727204408.82220: ^ task is: TASK: Show current_interfaces 37031 1727204408.82225: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204408.82229: getting variables 37031 1727204408.82231: in VariableManager get_vars() 37031 1727204408.82283: Calling all_inventory to load vars for managed-node2 37031 1727204408.82286: Calling groups_inventory to load vars for managed-node2 37031 1727204408.82289: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.82301: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.82304: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.82307: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.83339: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000071c 37031 1727204408.83342: WORKER PROCESS EXITING 37031 1727204408.83985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.85598: done with get_vars() 37031 1727204408.85627: done getting variables 37031 1727204408.85695: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.062) 0:00:31.402 ***** 37031 1727204408.85734: entering _queue_task() for managed-node2/debug 37031 1727204408.86088: worker is 1 (out of 1 available) 37031 1727204408.86101: exiting _queue_task() for managed-node2/debug 37031 1727204408.86114: done queuing things up, now waiting for results queue to drain 37031 1727204408.86115: waiting for pending 
results... 37031 1727204408.86402: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 37031 1727204408.86534: in run() - task 0affcd87-79f5-b754-dfb8-0000000006e5 37031 1727204408.86559: variable 'ansible_search_path' from source: unknown 37031 1727204408.86571: variable 'ansible_search_path' from source: unknown 37031 1727204408.86615: calling self._execute() 37031 1727204408.86716: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.86727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.86742: variable 'omit' from source: magic vars 37031 1727204408.87086: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.87097: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.87103: variable 'omit' from source: magic vars 37031 1727204408.87139: variable 'omit' from source: magic vars 37031 1727204408.87209: variable 'current_interfaces' from source: set_fact 37031 1727204408.87236: variable 'omit' from source: magic vars 37031 1727204408.87270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204408.87297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204408.87313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204408.87327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.87338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.87366: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204408.87370: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.87373: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.87439: Set connection var ansible_connection to ssh 37031 1727204408.87442: Set connection var ansible_shell_type to sh 37031 1727204408.87449: Set connection var ansible_pipelining to False 37031 1727204408.87460: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204408.87463: Set connection var ansible_timeout to 10 37031 1727204408.87470: Set connection var ansible_shell_executable to /bin/sh 37031 1727204408.87488: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.87492: variable 'ansible_connection' from source: unknown 37031 1727204408.87495: variable 'ansible_module_compression' from source: unknown 37031 1727204408.87497: variable 'ansible_shell_type' from source: unknown 37031 1727204408.87499: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.87502: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.87504: variable 'ansible_pipelining' from source: unknown 37031 1727204408.87506: variable 'ansible_timeout' from source: unknown 37031 1727204408.87511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.88045: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204408.88048: variable 'omit' from source: magic vars 37031 1727204408.88051: starting attempt loop 37031 1727204408.88053: running the handler 37031 1727204408.88055: handler run complete 37031 1727204408.88058: attempt loop complete, returning result 37031 1727204408.88060: _execute() done 37031 1727204408.88062: dumping result to json 37031 1727204408.88067: done dumping result, returning 37031 1727204408.88073: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [0affcd87-79f5-b754-dfb8-0000000006e5] 37031 1727204408.88076: sending task result for task 0affcd87-79f5-b754-dfb8-0000000006e5 37031 1727204408.88139: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000006e5 37031 1727204408.88142: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 37031 1727204408.88294: no more pending results, returning what we have 37031 1727204408.88297: results queue empty 37031 1727204408.88298: checking for any_errors_fatal 37031 1727204408.88304: done checking for any_errors_fatal 37031 1727204408.88305: checking for max_fail_percentage 37031 1727204408.88306: done checking for max_fail_percentage 37031 1727204408.88307: checking to see if all hosts have failed and the running result is not ok 37031 1727204408.88308: done checking to see if all hosts have failed 37031 1727204408.88309: getting the remaining hosts for this loop 37031 1727204408.88311: done getting the remaining hosts for this loop 37031 1727204408.88315: getting the next task for host managed-node2 37031 1727204408.88323: done getting next task for host managed-node2 37031 1727204408.88326: ^ task is: TASK: Install iproute 37031 1727204408.88329: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204408.88333: getting variables 37031 1727204408.88335: in VariableManager get_vars() 37031 1727204408.88378: Calling all_inventory to load vars for managed-node2 37031 1727204408.88381: Calling groups_inventory to load vars for managed-node2 37031 1727204408.88383: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204408.88393: Calling all_plugins_play to load vars for managed-node2 37031 1727204408.88395: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204408.88398: Calling groups_plugins_play to load vars for managed-node2 37031 1727204408.89903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204408.92825: done with get_vars() 37031 1727204408.92858: done getting variables 37031 1727204408.92920: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:00:08 -0400 (0:00:00.072) 0:00:31.474 ***** 37031 1727204408.92952: entering _queue_task() for managed-node2/package 37031 1727204408.93287: worker is 1 (out of 1 available) 37031 1727204408.93301: exiting _queue_task() for managed-node2/package 37031 1727204408.93316: done queuing things up, now waiting for results queue to drain 37031 1727204408.93318: waiting for pending results... 
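The `Install iproute` task queued next (task path `.../tests/network/playbooks/tasks/manage_test_interface.yml:16`) uses the generic `package` action, which the log shows being dispatched to the `dnf` backend on this node (the cached `ansible.modules.dnf` AnsiballZ payload is transferred over SFTP a few entries later). The log also shows the `__network_is_ostree` fact being read while templating the task. A hedged sketch of what such a task could look like — the `when` guard is an assumption; the log only shows the fact being consulted:

```yaml
# Hedged sketch, not the verbatim test source. `package` delegates to the
# platform package module; on this RHEL-family host that resolves to dnf.
- name: Install iproute
  package:
    name: iproute
    state: present
  when: not __network_is_ostree | d(false)  # assumed guard; not confirmed by the log
```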
37031 1727204408.93616: running TaskExecutor() for managed-node2/TASK: Install iproute 37031 1727204408.93713: in run() - task 0affcd87-79f5-b754-dfb8-0000000005cf 37031 1727204408.93726: variable 'ansible_search_path' from source: unknown 37031 1727204408.93731: variable 'ansible_search_path' from source: unknown 37031 1727204408.93772: calling self._execute() 37031 1727204408.93866: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.93877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.93887: variable 'omit' from source: magic vars 37031 1727204408.94340: variable 'ansible_distribution_major_version' from source: facts 37031 1727204408.94357: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204408.94371: variable 'omit' from source: magic vars 37031 1727204408.95078: variable 'omit' from source: magic vars 37031 1727204408.95082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 37031 1727204408.98430: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 37031 1727204408.98511: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 37031 1727204408.98549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 37031 1727204408.98594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 37031 1727204408.98621: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 37031 1727204408.98729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 37031 1727204408.98777: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 37031 1727204408.98808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 37031 1727204408.98850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 37031 1727204408.98871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 37031 1727204408.98990: variable '__network_is_ostree' from source: set_fact 37031 1727204408.98994: variable 'omit' from source: magic vars 37031 1727204408.99031: variable 'omit' from source: magic vars 37031 1727204408.99069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204408.99096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204408.99116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204408.99138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.99149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204408.99195: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204408.99199: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.99201: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 37031 1727204408.99369: Set connection var ansible_connection to ssh 37031 1727204408.99372: Set connection var ansible_shell_type to sh 37031 1727204408.99378: Set connection var ansible_pipelining to False 37031 1727204408.99386: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204408.99392: Set connection var ansible_timeout to 10 37031 1727204408.99398: Set connection var ansible_shell_executable to /bin/sh 37031 1727204408.99428: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.99431: variable 'ansible_connection' from source: unknown 37031 1727204408.99434: variable 'ansible_module_compression' from source: unknown 37031 1727204408.99436: variable 'ansible_shell_type' from source: unknown 37031 1727204408.99438: variable 'ansible_shell_executable' from source: unknown 37031 1727204408.99440: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204408.99442: variable 'ansible_pipelining' from source: unknown 37031 1727204408.99446: variable 'ansible_timeout' from source: unknown 37031 1727204408.99450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204408.99551: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204408.99570: variable 'omit' from source: magic vars 37031 1727204408.99578: starting attempt loop 37031 1727204408.99581: running the handler 37031 1727204408.99588: variable 'ansible_facts' from source: unknown 37031 1727204408.99591: variable 'ansible_facts' from source: unknown 37031 1727204408.99625: _low_level_execute_command(): starting 37031 1727204408.99633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 
1727204409.01679: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204409.01691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.01701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.01715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.01811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.01818: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204409.01828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.01840: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204409.01848: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204409.01854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204409.01861: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.01873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.01885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.01894: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.01901: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204409.01914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.01986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204409.02005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204409.02018: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204409.02093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204409.03783: stdout chunk (state=3): >>>/root <<< 37031 1727204409.03949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204409.03953: stdout chunk (state=3): >>><<< 37031 1727204409.03966: stderr chunk (state=3): >>><<< 37031 1727204409.03987: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204409.04000: _low_level_execute_command(): starting 37031 1727204409.04007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532 `" && echo 
ansible-tmp-1727204409.0398786-39078-224329521589532="` echo /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532 `" ) && sleep 0' 37031 1727204409.04630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204409.04641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.04650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.04666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.04707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.04713: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204409.04723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.04735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204409.04743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204409.04749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204409.04759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.04767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.04778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.04785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.04793: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204409.04800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.04877: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 37031 1727204409.04891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204409.04905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204409.04978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204409.06831: stdout chunk (state=3): >>>ansible-tmp-1727204409.0398786-39078-224329521589532=/root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532 <<< 37031 1727204409.06954: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204409.07047: stderr chunk (state=3): >>><<< 37031 1727204409.07063: stdout chunk (state=3): >>><<< 37031 1727204409.07373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204409.0398786-39078-224329521589532=/root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204409.07377: variable 'ansible_module_compression' from source: unknown 37031 1727204409.07379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 37031 1727204409.07381: variable 'ansible_facts' from source: unknown 37031 1727204409.07383: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/AnsiballZ_dnf.py 37031 1727204409.07498: Sending initial data 37031 1727204409.07501: Sent initial data (152 bytes) 37031 1727204409.08526: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204409.08545: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.08569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.08596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.08645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.08663: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204409.08682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.08707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204409.08725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204409.08739: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204409.08753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.08774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.08790: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.08806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.08821: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204409.08837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.08914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204409.08944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204409.08965: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204409.09037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204409.10777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204409.10808: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204409.10851: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmp9vqwbqtx /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/AnsiballZ_dnf.py <<< 37031 1727204409.10888: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204409.12281: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204409.12579: stderr chunk (state=3): >>><<< 37031 1727204409.12583: stdout chunk (state=3): >>><<< 37031 1727204409.12585: done transferring module to remote 37031 1727204409.12587: _low_level_execute_command(): starting 37031 1727204409.12590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/ /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/AnsiballZ_dnf.py && sleep 0' 37031 1727204409.13211: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204409.13232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.13250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.13276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.13320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.13332: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204409.13352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.13376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204409.13389: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204409.13400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204409.13411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.13423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.13438: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.13455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.13472: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204409.13487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.13561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204409.13589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204409.13605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204409.13677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204409.15513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204409.15517: stdout chunk (state=3): >>><<< 37031 1727204409.15519: stderr chunk (state=3): >>><<< 37031 1727204409.15625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204409.15630: _low_level_execute_command(): starting 37031 1727204409.15634: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/AnsiballZ_dnf.py && sleep 0' 37031 1727204409.16334: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204409.16349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.16370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.16391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.16443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.16459: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204409.16478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.16498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204409.16510: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204409.16530: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204409.16544: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204409.16562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204409.16582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204409.16595: stderr chunk (state=3): >>>debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204409.16606: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204409.16620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204409.16712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204409.16768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204409.16786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204409.16878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.10253: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 37031 1727204410.14870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204410.14875: stdout chunk (state=3): >>><<< 37031 1727204410.14878: stderr chunk (state=3): >>><<< 37031 1727204410.14970: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204410.14976: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204410.14979: _low_level_execute_command(): starting 37031 1727204410.14981: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204409.0398786-39078-224329521589532/ > /dev/null 2>&1 && sleep 0' 37031 1727204410.15681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204410.15698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.15714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.15732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.15787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.15799: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204410.15812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.15830: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204410.15841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204410.15854: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204410.15875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.15890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.15906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.15919: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.15930: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204410.15943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.16027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204410.16051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.16071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.16148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.17989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.18078: stderr chunk (state=3): >>><<< 37031 1727204410.18094: stdout chunk (state=3): >>><<< 37031 1727204410.18169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204410.18178: handler run complete 37031 1727204410.18472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 37031 1727204410.18511: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 37031 1727204410.18554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 37031 1727204410.18602: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 37031 1727204410.18651: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 37031 1727204410.18739: variable '__install_status' from source: set_fact 37031 1727204410.18769: Evaluated conditional (__install_status is success): True 37031 1727204410.18793: attempt loop complete, returning result 37031 1727204410.18807: _execute() done 37031 1727204410.18815: dumping result to json 37031 1727204410.18824: done dumping result, returning 37031 1727204410.18837: done running TaskExecutor() for managed-node2/TASK: Install iproute [0affcd87-79f5-b754-dfb8-0000000005cf] 37031 1727204410.18847: 
sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cf ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 37031 1727204410.19062: no more pending results, returning what we have 37031 1727204410.19067: results queue empty 37031 1727204410.19069: checking for any_errors_fatal 37031 1727204410.19077: done checking for any_errors_fatal 37031 1727204410.19078: checking for max_fail_percentage 37031 1727204410.19080: done checking for max_fail_percentage 37031 1727204410.19081: checking to see if all hosts have failed and the running result is not ok 37031 1727204410.19082: done checking to see if all hosts have failed 37031 1727204410.19082: getting the remaining hosts for this loop 37031 1727204410.19084: done getting the remaining hosts for this loop 37031 1727204410.19089: getting the next task for host managed-node2 37031 1727204410.19096: done getting next task for host managed-node2 37031 1727204410.19099: ^ task is: TASK: Create veth interface {{ interface }} 37031 1727204410.19102: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204410.19107: getting variables 37031 1727204410.19109: in VariableManager get_vars() 37031 1727204410.19153: Calling all_inventory to load vars for managed-node2 37031 1727204410.19155: Calling groups_inventory to load vars for managed-node2 37031 1727204410.19161: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204410.19174: Calling all_plugins_play to load vars for managed-node2 37031 1727204410.19178: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204410.19181: Calling groups_plugins_play to load vars for managed-node2 37031 1727204410.20215: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005cf 37031 1727204410.20219: WORKER PROCESS EXITING 37031 1727204410.21045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204410.22949: done with get_vars() 37031 1727204410.22983: done getting variables 37031 1727204410.23045: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204410.23185: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:00:10 -0400 (0:00:01.302) 0:00:32.777 ***** 37031 1727204410.23217: entering _queue_task() for managed-node2/command 37031 1727204410.23588: worker is 1 (out of 1 available) 37031 1727204410.23606: exiting _queue_task() for managed-node2/command 37031 1727204410.23620: done queuing things up, now waiting for results queue to drain 37031 1727204410.23621: waiting for pending results... 
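[Editor's note] The `module_args` JSON dumped in the "Install iproute" result above maps back to a play task that, reconstructed from this log alone, would look roughly like the following. This is a hedged sketch, not the actual test-playbook source: only `name` and `state` were set explicitly (every other argument in the JSON is a dnf module default), and the `"attempts": 1` field plus the `__install_status is success` conditional suggest a retry wrapper whose exact parameters are not visible here.

```yaml
# Hypothetical reconstruction of the "Install iproute" task seen in this log.
# Only name/state were passed explicitly; all other module_args are dnf defaults.
- name: Install iproute
  ansible.builtin.dnf:
    name: iproute
    state: present
  register: __install_status   # assumed: the log evaluates "__install_status is success"
```

The `"msg": "Nothing to do"` with `"changed": false` indicates iproute was already present on managed-node2, which is why the task reports `ok` rather than `changed`.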
37031 1727204410.23930: running TaskExecutor() for managed-node2/TASK: Create veth interface veth0 37031 1727204410.24052: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d0 37031 1727204410.24083: variable 'ansible_search_path' from source: unknown 37031 1727204410.24091: variable 'ansible_search_path' from source: unknown 37031 1727204410.24393: variable 'interface' from source: play vars 37031 1727204410.24500: variable 'interface' from source: play vars 37031 1727204410.24588: variable 'interface' from source: play vars 37031 1727204410.24754: Loaded config def from plugin (lookup/items) 37031 1727204410.24772: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 37031 1727204410.24800: variable 'omit' from source: magic vars 37031 1727204410.24954: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.24973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.24988: variable 'omit' from source: magic vars 37031 1727204410.25245: variable 'ansible_distribution_major_version' from source: facts 37031 1727204410.25269: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204410.25490: variable 'type' from source: play vars 37031 1727204410.25499: variable 'state' from source: include params 37031 1727204410.25507: variable 'interface' from source: play vars 37031 1727204410.25515: variable 'current_interfaces' from source: set_fact 37031 1727204410.25525: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 37031 1727204410.25532: when evaluation is False, skipping this task 37031 1727204410.25569: variable 'item' from source: unknown 37031 1727204410.25648: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 
'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 37031 1727204410.25896: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.25911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.25923: variable 'omit' from source: magic vars 37031 1727204410.26101: variable 'ansible_distribution_major_version' from source: facts 37031 1727204410.26113: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204410.26324: variable 'type' from source: play vars 37031 1727204410.26333: variable 'state' from source: include params 37031 1727204410.26341: variable 'interface' from source: play vars 37031 1727204410.26348: variable 'current_interfaces' from source: set_fact 37031 1727204410.26361: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 37031 1727204410.26372: when evaluation is False, skipping this task 37031 1727204410.26401: variable 'item' from source: unknown 37031 1727204410.26476: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 37031 1727204410.26630: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.26643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.26655: variable 'omit' from source: magic vars 37031 1727204410.26827: variable 'ansible_distribution_major_version' from source: facts 37031 1727204410.26839: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204410.27047: variable 'type' 
from source: play vars 37031 1727204410.27060: variable 'state' from source: include params 37031 1727204410.27072: variable 'interface' from source: play vars 37031 1727204410.27080: variable 'current_interfaces' from source: set_fact 37031 1727204410.27090: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 37031 1727204410.27096: when evaluation is False, skipping this task 37031 1727204410.27134: variable 'item' from source: unknown 37031 1727204410.27202: variable 'item' from source: unknown skipping: [managed-node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 37031 1727204410.27301: dumping result to json 37031 1727204410.27310: done dumping result, returning 37031 1727204410.27319: done running TaskExecutor() for managed-node2/TASK: Create veth interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d0] 37031 1727204410.27332: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d0 skipping: [managed-node2] => { "changed": false } MSG: All items skipped 37031 1727204410.27439: no more pending results, returning what we have 37031 1727204410.27443: results queue empty 37031 1727204410.27445: checking for any_errors_fatal 37031 1727204410.27452: done checking for any_errors_fatal 37031 1727204410.27453: checking for max_fail_percentage 37031 1727204410.27454: done checking for max_fail_percentage 37031 1727204410.27455: checking to see if all hosts have failed and the running result is not ok 37031 1727204410.27458: done checking to see if all hosts have failed 37031 1727204410.27459: getting the remaining hosts for this loop 37031 1727204410.27461: done getting the remaining hosts for this loop 37031 1727204410.27467: getting the next task for host managed-node2 37031 
1727204410.27474: done getting next task for host managed-node2 37031 1727204410.27478: ^ task is: TASK: Set up veth as managed by NetworkManager 37031 1727204410.27482: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 37031 1727204410.27485: getting variables 37031 1727204410.27487: in VariableManager get_vars() 37031 1727204410.27533: Calling all_inventory to load vars for managed-node2 37031 1727204410.27536: Calling groups_inventory to load vars for managed-node2 37031 1727204410.27538: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204410.27551: Calling all_plugins_play to load vars for managed-node2 37031 1727204410.27554: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204410.27559: Calling groups_plugins_play to load vars for managed-node2 37031 1727204410.28665: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d0 37031 1727204410.28669: WORKER PROCESS EXITING 37031 1727204410.29383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204410.31172: done with get_vars() 37031 1727204410.31202: done getting variables 37031 1727204410.31272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:00:10 -0400 (0:00:00.080) 0:00:32.857 ***** 37031 1727204410.31310: entering _queue_task() for managed-node2/command 37031 1727204410.31651: worker is 1 (out of 1 available) 37031 1727204410.31668: exiting _queue_task() for managed-node2/command 37031 1727204410.31681: done queuing things up, now waiting for results queue to drain 37031 1727204410.31683: waiting for pending results... 37031 1727204410.31986: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 37031 1727204410.32105: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d1 37031 1727204410.32130: variable 'ansible_search_path' from source: unknown 37031 1727204410.32140: variable 'ansible_search_path' from source: unknown 37031 1727204410.32185: calling self._execute() 37031 1727204410.32286: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.32297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.32311: variable 'omit' from source: magic vars 37031 1727204410.32702: variable 'ansible_distribution_major_version' from source: facts 37031 1727204410.32719: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204410.32894: variable 'type' from source: play vars 37031 1727204410.32904: variable 'state' from source: include params 37031 1727204410.32913: Evaluated conditional (type == 'veth' and state == 'present'): False 37031 1727204410.32920: when evaluation is False, skipping this task 37031 1727204410.32926: _execute() done 37031 1727204410.32933: dumping result to json 37031 1727204410.32939: done dumping result, returning 37031 1727204410.32948: done running TaskExecutor() for 
managed-node2/TASK: Set up veth as managed by NetworkManager [0affcd87-79f5-b754-dfb8-0000000005d1] 37031 1727204410.32956: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d1 37031 1727204410.33070: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d1 37031 1727204410.33078: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 37031 1727204410.33136: no more pending results, returning what we have 37031 1727204410.33141: results queue empty 37031 1727204410.33142: checking for any_errors_fatal 37031 1727204410.33155: done checking for any_errors_fatal 37031 1727204410.33155: checking for max_fail_percentage 37031 1727204410.33160: done checking for max_fail_percentage 37031 1727204410.33161: checking to see if all hosts have failed and the running result is not ok 37031 1727204410.33162: done checking to see if all hosts have failed 37031 1727204410.33163: getting the remaining hosts for this loop 37031 1727204410.33166: done getting the remaining hosts for this loop 37031 1727204410.33171: getting the next task for host managed-node2 37031 1727204410.33178: done getting next task for host managed-node2 37031 1727204410.33180: ^ task is: TASK: Delete veth interface {{ interface }} 37031 1727204410.33184: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204410.33188: getting variables 37031 1727204410.33190: in VariableManager get_vars() 37031 1727204410.33236: Calling all_inventory to load vars for managed-node2 37031 1727204410.33238: Calling groups_inventory to load vars for managed-node2 37031 1727204410.33241: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204410.33254: Calling all_plugins_play to load vars for managed-node2 37031 1727204410.33259: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204410.33262: Calling groups_plugins_play to load vars for managed-node2 37031 1727204410.35125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204410.36815: done with get_vars() 37031 1727204410.36846: done getting variables 37031 1727204410.36913: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 37031 1727204410.37036: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:00:10 -0400 (0:00:00.057) 0:00:32.915 ***** 37031 1727204410.37075: entering _queue_task() for managed-node2/command 37031 1727204410.37406: worker is 1 (out of 1 available) 37031 1727204410.37420: exiting _queue_task() for managed-node2/command 37031 1727204410.37434: done queuing things up, now waiting for results queue to drain 37031 1727204410.37435: waiting for pending results... 
37031 1727204410.37741: running TaskExecutor() for managed-node2/TASK: Delete veth interface veth0 37031 1727204410.37862: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d2 37031 1727204410.37888: variable 'ansible_search_path' from source: unknown 37031 1727204410.37897: variable 'ansible_search_path' from source: unknown 37031 1727204410.37944: calling self._execute() 37031 1727204410.38050: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.38070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.38085: variable 'omit' from source: magic vars 37031 1727204410.38496: variable 'ansible_distribution_major_version' from source: facts 37031 1727204410.38514: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204410.38744: variable 'type' from source: play vars 37031 1727204410.38759: variable 'state' from source: include params 37031 1727204410.38772: variable 'interface' from source: play vars 37031 1727204410.38780: variable 'current_interfaces' from source: set_fact 37031 1727204410.38791: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 37031 1727204410.38801: variable 'omit' from source: magic vars 37031 1727204410.38848: variable 'omit' from source: magic vars 37031 1727204410.38962: variable 'interface' from source: play vars 37031 1727204410.38988: variable 'omit' from source: magic vars 37031 1727204410.39036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204410.39082: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204410.39107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204410.39127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 37031 1727204410.39147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204410.39185: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204410.39196: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.39203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.39312: Set connection var ansible_connection to ssh 37031 1727204410.39319: Set connection var ansible_shell_type to sh 37031 1727204410.39331: Set connection var ansible_pipelining to False 37031 1727204410.39342: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204410.39355: Set connection var ansible_timeout to 10 37031 1727204410.39373: Set connection var ansible_shell_executable to /bin/sh 37031 1727204410.39406: variable 'ansible_shell_executable' from source: unknown 37031 1727204410.39417: variable 'ansible_connection' from source: unknown 37031 1727204410.39424: variable 'ansible_module_compression' from source: unknown 37031 1727204410.39431: variable 'ansible_shell_type' from source: unknown 37031 1727204410.39437: variable 'ansible_shell_executable' from source: unknown 37031 1727204410.39444: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204410.39451: variable 'ansible_pipelining' from source: unknown 37031 1727204410.39459: variable 'ansible_timeout' from source: unknown 37031 1727204410.39473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204410.39619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204410.39640: 
variable 'omit' from source: magic vars 37031 1727204410.39650: starting attempt loop 37031 1727204410.39660: running the handler 37031 1727204410.39681: _low_level_execute_command(): starting 37031 1727204410.39699: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204410.40518: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204410.40534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.40549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.40573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.40625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.40638: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204410.40651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.40676: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204410.40693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204410.40706: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204410.40722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.40736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.40752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.40771: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.40783: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204410.40798: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.40882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204410.40909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.40932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.41007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.42615: stdout chunk (state=3): >>>/root <<< 37031 1727204410.42723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.42817: stderr chunk (state=3): >>><<< 37031 1727204410.42836: stdout chunk (state=3): >>><<< 37031 1727204410.42967: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204410.42979: 
_low_level_execute_command(): starting 37031 1727204410.42982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665 `" && echo ansible-tmp-1727204410.4287112-39253-42267149615665="` echo /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665 `" ) && sleep 0' 37031 1727204410.44191: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.44194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.44229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.44232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.44235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.44307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.44321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.44401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.46251: stdout chunk (state=3): 
>>>ansible-tmp-1727204410.4287112-39253-42267149615665=/root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665 <<< 37031 1727204410.46367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.46448: stderr chunk (state=3): >>><<< 37031 1727204410.46462: stdout chunk (state=3): >>><<< 37031 1727204410.46772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204410.4287112-39253-42267149615665=/root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204410.46775: variable 'ansible_module_compression' from source: unknown 37031 1727204410.46778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204410.46780: variable 'ansible_facts' from source: unknown 37031 
1727204410.46782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/AnsiballZ_command.py 37031 1727204410.48296: Sending initial data 37031 1727204410.48299: Sent initial data (155 bytes) 37031 1727204410.50453: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204410.50484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.50498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.50512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.50588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.50606: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204410.50616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.50643: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204410.50679: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204410.50686: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204410.50694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.50705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.50720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.50728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.50735: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204410.50745: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.50819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204410.50945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.50959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.51079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.52761: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204410.52798: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204410.52837: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpqe23fa_s /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/AnsiballZ_command.py <<< 37031 1727204410.52874: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204410.54149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.54239: stderr chunk (state=3): >>><<< 37031 1727204410.54242: stdout chunk (state=3): >>><<< 37031 1727204410.54268: done transferring module to remote 37031 1727204410.54282: _low_level_execute_command(): starting 37031 1727204410.54285: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/ /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/AnsiballZ_command.py && sleep 0' 37031 1727204410.56078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204410.56087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.56097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.56125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.56194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.56237: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204410.56245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.56262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204410.56286: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204410.56293: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204410.56301: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.56309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.56341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.56349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.56359: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204410.56380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.56549: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204410.56572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.56581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.56741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.58512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.58516: stdout chunk (state=3): >>><<< 37031 1727204410.58543: stderr chunk (state=3): >>><<< 37031 1727204410.58547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204410.58550: _low_level_execute_command(): starting 37031 1727204410.58553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/AnsiballZ_command.py && sleep 0' 37031 1727204410.59316: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204410.59327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.59334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.59348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.59387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.59395: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204410.59408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.59421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204410.59429: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204410.59436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204410.59441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.59450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.59462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.59472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204410.59478: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204410.59487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.59554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 
1727204410.59573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.59576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.59659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.74145: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:00:10.726039", "end": "2024-09-24 15:00:10.740652", "delta": "0:00:00.014613", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204410.75559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204410.75563: stdout chunk (state=3): >>><<< 37031 1727204410.75567: stderr chunk (state=3): >>><<< 37031 1727204410.75711: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-24 15:00:10.726039", "end": "2024-09-24 15:00:10.740652", "delta": "0:00:00.014613", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204410.75721: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204410.75724: _low_level_execute_command(): starting 37031 1727204410.75726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204410.4287112-39253-42267149615665/ > /dev/null 2>&1 && sleep 0' 37031 1727204410.76320: stderr chunk (state=2): >>>OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204410.76324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204410.76360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204410.76366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204410.76369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204410.76436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204410.76451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204410.76530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204410.78307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204410.78392: stderr chunk (state=3): >>><<< 37031 1727204410.78407: stdout chunk (state=3): >>><<< 37031 1727204410.78474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
37031 1727204410.78477: handler run complete
37031 1727204410.78675: Evaluated conditional (False): False
37031 1727204410.78678: attempt loop complete, returning result
37031 1727204410.78680: _execute() done
37031 1727204410.78683: dumping result to json
37031 1727204410.78685: done dumping result, returning
37031 1727204410.78687: done running TaskExecutor() for managed-node2/TASK: Delete veth interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d2]
37031 1727204410.78689: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d2
37031 1727204410.78767: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d2
37031 1727204410.78771: WORKER PROCESS EXITING
ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.014613", "end": "2024-09-24 15:00:10.740652", "rc": 0, "start": "2024-09-24 15:00:10.726039" }
37031 1727204410.78848: no more pending results, returning what we have
37031 1727204410.78853: results queue empty
37031 1727204410.78854: checking for any_errors_fatal
37031 1727204410.78868: done checking for any_errors_fatal
37031 1727204410.78870: checking for max_fail_percentage
37031 1727204410.78872: done checking for max_fail_percentage
37031 1727204410.78873: checking to see if all hosts have failed and the running result is not ok
37031 1727204410.78874: done checking to see if all hosts have failed
37031 1727204410.78875: getting the remaining hosts for this loop
37031 1727204410.78877: done getting the remaining hosts for this loop
37031 1727204410.78881: getting the next task for host managed-node2
37031 1727204410.78890: done getting next task for host managed-node2
37031 1727204410.78893: ^ task is: TASK: Create dummy interface {{ interface }}
37031 1727204410.78897: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204410.78901: getting variables
37031 1727204410.78904: in VariableManager get_vars()
37031 1727204410.78950: Calling all_inventory to load vars for managed-node2
37031 1727204410.78954: Calling groups_inventory to load vars for managed-node2
37031 1727204410.78959: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204410.78974: Calling all_plugins_play to load vars for managed-node2
37031 1727204410.78977: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204410.78981: Calling groups_plugins_play to load vars for managed-node2
37031 1727204410.80292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204410.81199: done with get_vars()
37031 1727204410.81215: done getting variables
37031 1727204410.81266: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
37031 1727204410.81346: variable 'interface' from source: play vars

TASK [Create dummy interface veth0] ********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Tuesday 24 September 2024  15:00:10 -0400 (0:00:00.442)       0:00:33.358 *****
37031 1727204410.81374: entering _queue_task() for managed-node2/command
37031 1727204410.81655: worker is 1 (out of 1 available)
37031 1727204410.81672: exiting _queue_task() for managed-node2/command
37031 1727204410.81685: done queuing things up, now waiting for results queue to drain
37031 1727204410.81686: waiting for pending results...
37031 1727204410.81987: running TaskExecutor() for managed-node2/TASK: Create dummy interface veth0
37031 1727204410.82104: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d3
37031 1727204410.82127: variable 'ansible_search_path' from source: unknown
37031 1727204410.82137: variable 'ansible_search_path' from source: unknown
37031 1727204410.82183: calling self._execute()
37031 1727204410.82283: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204410.82294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204410.82309: variable 'omit' from source: magic vars
37031 1727204410.82669: variable 'ansible_distribution_major_version' from source: facts
37031 1727204410.82680: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204410.82822: variable 'type' from source: play vars
37031 1727204410.82825: variable 'state' from source: include params
37031 1727204410.82828: variable 'interface' from source: play vars
37031 1727204410.82832: variable 'current_interfaces' from source: set_fact
37031 1727204410.82840: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False
37031 1727204410.82842: when evaluation is False, skipping this task
37031 1727204410.82845: _execute() done
37031 1727204410.82849: dumping result to json
37031 1727204410.82852: done dumping result, returning
37031 1727204410.82861: done running TaskExecutor() for managed-node2/TASK: Create dummy interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d3]
37031 1727204410.82863: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d3
37031 1727204410.82944: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d3
37031 1727204410.82947: WORKER PROCESS EXITING
skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" }
37031 1727204410.82998: no more pending results, returning what we have
37031 1727204410.83002: results queue empty
37031 1727204410.83004: checking for any_errors_fatal
37031 1727204410.83011: done checking for any_errors_fatal
37031 1727204410.83012: checking for max_fail_percentage
37031 1727204410.83014: done checking for max_fail_percentage
37031 1727204410.83015: checking to see if all hosts have failed and the running result is not ok
37031 1727204410.83016: done checking to see if all hosts have failed
37031 1727204410.83017: getting the remaining hosts for this loop
37031 1727204410.83018: done getting the remaining hosts for this loop
37031 1727204410.83023: getting the next task for host managed-node2
37031 1727204410.83028: done getting next task for host managed-node2
37031 1727204410.83031: ^ task is: TASK: Delete dummy interface {{ interface }}
37031 1727204410.83035: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204410.83039: getting variables
37031 1727204410.83040: in VariableManager get_vars()
37031 1727204410.83090: Calling all_inventory to load vars for managed-node2
37031 1727204410.83092: Calling groups_inventory to load vars for managed-node2
37031 1727204410.83094: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204410.83103: Calling all_plugins_play to load vars for managed-node2
37031 1727204410.83105: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204410.83107: Calling groups_plugins_play to load vars for managed-node2
37031 1727204410.84004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204410.85224: done with get_vars()
37031 1727204410.85248: done getting variables
37031 1727204410.85309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
37031 1727204410.85420: variable 'interface' from source: play vars

TASK [Delete dummy interface veth0] ********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Tuesday 24 September 2024  15:00:10 -0400 (0:00:00.040)       0:00:33.399 *****
37031 1727204410.85452: entering _queue_task() for managed-node2/command
37031 1727204410.85760: worker is 1 (out of 1 available)
37031 1727204410.85774: exiting _queue_task() for managed-node2/command
37031 1727204410.85787: done queuing things up, now waiting for results queue to drain
37031 1727204410.85788: waiting for pending results...
37031 1727204410.86078: running TaskExecutor() for managed-node2/TASK: Delete dummy interface veth0
37031 1727204410.86202: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d4
37031 1727204410.86227: variable 'ansible_search_path' from source: unknown
37031 1727204410.86238: variable 'ansible_search_path' from source: unknown
37031 1727204410.86281: calling self._execute()
37031 1727204410.86385: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204410.86397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204410.86411: variable 'omit' from source: magic vars
37031 1727204410.86785: variable 'ansible_distribution_major_version' from source: facts
37031 1727204410.86804: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204410.87008: variable 'type' from source: play vars
37031 1727204410.87019: variable 'state' from source: include params
37031 1727204410.87027: variable 'interface' from source: play vars
37031 1727204410.87035: variable 'current_interfaces' from source: set_fact
37031 1727204410.87045: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
37031 1727204410.87051: when evaluation is False, skipping this task
37031 1727204410.87057: _execute() done
37031 1727204410.87066: dumping result to json
37031 1727204410.87074: done dumping result, returning
37031 1727204410.87084: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d4]
37031 1727204410.87092: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d4
37031 1727204410.87197: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d4
37031 1727204410.87204: WORKER PROCESS EXITING
skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" }
37031 1727204410.87254: no more pending results, returning what we have
37031 1727204410.87258: results queue empty
37031 1727204410.87259: checking for any_errors_fatal
37031 1727204410.87265: done checking for any_errors_fatal
37031 1727204410.87266: checking for max_fail_percentage
37031 1727204410.87268: done checking for max_fail_percentage
37031 1727204410.87269: checking to see if all hosts have failed and the running result is not ok
37031 1727204410.87270: done checking to see if all hosts have failed
37031 1727204410.87271: getting the remaining hosts for this loop
37031 1727204410.87273: done getting the remaining hosts for this loop
37031 1727204410.87277: getting the next task for host managed-node2
37031 1727204410.87284: done getting next task for host managed-node2
37031 1727204410.87287: ^ task is: TASK: Create tap interface {{ interface }}
37031 1727204410.87292: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204410.87297: getting variables
37031 1727204410.87299: in VariableManager get_vars()
37031 1727204410.87343: Calling all_inventory to load vars for managed-node2
37031 1727204410.87346: Calling groups_inventory to load vars for managed-node2
37031 1727204410.87349: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204410.87362: Calling all_plugins_play to load vars for managed-node2
37031 1727204410.87366: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204410.87370: Calling groups_plugins_play to load vars for managed-node2
37031 1727204410.89054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204410.91536: done with get_vars()
37031 1727204410.91561: done getting variables
37031 1727204410.91624: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
37031 1727204410.91735: variable 'interface' from source: play vars

TASK [Create tap interface veth0] **********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60
Tuesday 24 September 2024  15:00:10 -0400 (0:00:00.063)       0:00:33.462 *****
37031 1727204410.91768: entering _queue_task() for managed-node2/command
37031 1727204410.92090: worker is 1 (out of 1 available)
37031 1727204410.92103: exiting _queue_task() for managed-node2/command
37031 1727204410.92116: done queuing things up, now waiting for results queue to drain
37031 1727204410.92118: waiting for pending results...
37031 1727204410.92402: running TaskExecutor() for managed-node2/TASK: Create tap interface veth0
37031 1727204410.92517: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d5
37031 1727204410.92537: variable 'ansible_search_path' from source: unknown
37031 1727204410.92544: variable 'ansible_search_path' from source: unknown
37031 1727204410.92589: calling self._execute()
37031 1727204410.92685: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204410.92697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204410.92710: variable 'omit' from source: magic vars
37031 1727204410.93075: variable 'ansible_distribution_major_version' from source: facts
37031 1727204410.93093: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204410.93422: variable 'type' from source: play vars
37031 1727204410.93431: variable 'state' from source: include params
37031 1727204410.93439: variable 'interface' from source: play vars
37031 1727204410.93451: variable 'current_interfaces' from source: set_fact
37031 1727204410.93461: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False
37031 1727204410.93472: when evaluation is False, skipping this task
37031 1727204410.93479: _execute() done
37031 1727204410.93487: dumping result to json
37031 1727204410.93495: done dumping result, returning
37031 1727204410.93503: done running TaskExecutor() for managed-node2/TASK: Create tap interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d5]
37031 1727204410.93511: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d5
skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" }
37031 1727204410.93653: no more pending results, returning what we have
37031 1727204410.93657: results queue empty
37031 1727204410.93658: checking for any_errors_fatal
37031 1727204410.93666: done checking for any_errors_fatal
37031 1727204410.93667: checking for max_fail_percentage
37031 1727204410.93669: done checking for max_fail_percentage
37031 1727204410.93670: checking to see if all hosts have failed and the running result is not ok
37031 1727204410.93671: done checking to see if all hosts have failed
37031 1727204410.93671: getting the remaining hosts for this loop
37031 1727204410.93673: done getting the remaining hosts for this loop
37031 1727204410.93677: getting the next task for host managed-node2
37031 1727204410.93685: done getting next task for host managed-node2
37031 1727204410.93687: ^ task is: TASK: Delete tap interface {{ interface }}
37031 1727204410.93691: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
37031 1727204410.93696: getting variables
37031 1727204410.93698: in VariableManager get_vars()
37031 1727204410.93743: Calling all_inventory to load vars for managed-node2
37031 1727204410.93746: Calling groups_inventory to load vars for managed-node2
37031 1727204410.93748: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204410.93761: Calling all_plugins_play to load vars for managed-node2
37031 1727204410.93768: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204410.93771: Calling groups_plugins_play to load vars for managed-node2
37031 1727204410.99553: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d5
37031 1727204410.99557: WORKER PROCESS EXITING
37031 1727204411.00466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204411.02918: done with get_vars()
37031 1727204411.02945: done getting variables
37031 1727204411.02997: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
37031 1727204411.03104: variable 'interface' from source: play vars

TASK [Delete tap interface veth0] **********************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65
Tuesday 24 September 2024  15:00:11 -0400 (0:00:00.113)       0:00:33.576 *****
37031 1727204411.03131: entering _queue_task() for managed-node2/command
37031 1727204411.03463: worker is 1 (out of 1 available)
37031 1727204411.03478: exiting _queue_task() for managed-node2/command
37031 1727204411.03492: done queuing things up, now waiting for results queue to drain
37031 1727204411.03494: waiting for pending results...
37031 1727204411.03781: running TaskExecutor() for managed-node2/TASK: Delete tap interface veth0
37031 1727204411.03895: in run() - task 0affcd87-79f5-b754-dfb8-0000000005d6
37031 1727204411.03914: variable 'ansible_search_path' from source: unknown
37031 1727204411.03923: variable 'ansible_search_path' from source: unknown
37031 1727204411.03968: calling self._execute()
37031 1727204411.04065: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204411.04076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204411.04089: variable 'omit' from source: magic vars
37031 1727204411.04458: variable 'ansible_distribution_major_version' from source: facts
37031 1727204411.04481: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204411.04704: variable 'type' from source: play vars
37031 1727204411.04716: variable 'state' from source: include params
37031 1727204411.04724: variable 'interface' from source: play vars
37031 1727204411.04731: variable 'current_interfaces' from source: set_fact
37031 1727204411.04741: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False
37031 1727204411.04747: when evaluation is False, skipping this task
37031 1727204411.04753: _execute() done
37031 1727204411.04759: dumping result to json
37031 1727204411.04768: done dumping result, returning
37031 1727204411.04882: done running TaskExecutor() for managed-node2/TASK: Delete tap interface veth0 [0affcd87-79f5-b754-dfb8-0000000005d6]
37031 1727204411.04896: sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d6
37031 1727204411.04996: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000005d6
skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" }
37031 1727204411.05042: no more pending results, returning what we have
37031 1727204411.05046: results queue empty
37031 1727204411.05048: checking for any_errors_fatal
37031 1727204411.05053: done checking for any_errors_fatal
37031 1727204411.05054: checking for max_fail_percentage
37031 1727204411.05055: done checking for max_fail_percentage
37031 1727204411.05056: checking to see if all hosts have failed and the running result is not ok
37031 1727204411.05057: done checking to see if all hosts have failed
37031 1727204411.05058: getting the remaining hosts for this loop
37031 1727204411.05060: done getting the remaining hosts for this loop
37031 1727204411.05068: getting the next task for host managed-node2
37031 1727204411.05076: done getting next task for host managed-node2
37031 1727204411.05080: ^ task is: TASK: Clean up namespace
37031 1727204411.05082: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
37031 1727204411.05087: getting variables
37031 1727204411.05089: in VariableManager get_vars()
37031 1727204411.05134: Calling all_inventory to load vars for managed-node2
37031 1727204411.05137: Calling groups_inventory to load vars for managed-node2
37031 1727204411.05140: Calling all_plugins_inventory to load vars for managed-node2
37031 1727204411.05152: Calling all_plugins_play to load vars for managed-node2
37031 1727204411.05155: Calling groups_plugins_inventory to load vars for managed-node2
37031 1727204411.05158: Calling groups_plugins_play to load vars for managed-node2
37031 1727204411.06381: WORKER PROCESS EXITING
37031 1727204411.07574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
37031 1727204411.09991: done with get_vars()
37031 1727204411.10025: done getting variables
37031 1727204411.10097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Clean up namespace] ******************************************************
task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108
Tuesday 24 September 2024  15:00:11 -0400 (0:00:00.069)       0:00:33.646 *****
37031 1727204411.10135: entering _queue_task() for managed-node2/command
37031 1727204411.10517: worker is 1 (out of 1 available)
37031 1727204411.10548: exiting _queue_task() for managed-node2/command
37031 1727204411.10573: done queuing things up, now waiting for results queue to drain
37031 1727204411.10575: waiting for pending results...
37031 1727204411.10977: running TaskExecutor() for managed-node2/TASK: Clean up namespace
37031 1727204411.11095: in run() - task 0affcd87-79f5-b754-dfb8-0000000000b4
37031 1727204411.11117: variable 'ansible_search_path' from source: unknown
37031 1727204411.11177: calling self._execute()
37031 1727204411.11289: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204411.11300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204411.11313: variable 'omit' from source: magic vars
37031 1727204411.11751: variable 'ansible_distribution_major_version' from source: facts
37031 1727204411.11773: Evaluated conditional (ansible_distribution_major_version != '6'): True
37031 1727204411.11797: variable 'omit' from source: magic vars
37031 1727204411.11813: variable 'omit' from source: magic vars
37031 1727204411.11851: variable 'omit' from source: magic vars
37031 1727204411.11891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
37031 1727204411.11925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
37031 1727204411.11954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
37031 1727204411.11979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204411.11995: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
37031 1727204411.12036: variable 'inventory_hostname' from source: host vars for 'managed-node2'
37031 1727204411.12045: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204411.12052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204411.12154: Set connection var ansible_connection to ssh
37031 1727204411.12166: Set connection var ansible_shell_type to sh
37031 1727204411.12179: Set connection var ansible_pipelining to False
37031 1727204411.12192: Set connection var ansible_module_compression to ZIP_DEFLATED
37031 1727204411.12202: Set connection var ansible_timeout to 10
37031 1727204411.12211: Set connection var ansible_shell_executable to /bin/sh
37031 1727204411.12254: variable 'ansible_shell_executable' from source: unknown
37031 1727204411.12261: variable 'ansible_connection' from source: unknown
37031 1727204411.12271: variable 'ansible_module_compression' from source: unknown
37031 1727204411.12278: variable 'ansible_shell_type' from source: unknown
37031 1727204411.12285: variable 'ansible_shell_executable' from source: unknown
37031 1727204411.12292: variable 'ansible_host' from source: host vars for 'managed-node2'
37031 1727204411.12299: variable 'ansible_pipelining' from source: unknown
37031 1727204411.12306: variable 'ansible_timeout' from source: unknown
37031 1727204411.12313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
37031 1727204411.12486: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
37031 1727204411.12502: variable 'omit' from source: magic vars
37031 1727204411.12511: starting attempt loop
37031 1727204411.12516: running the handler
37031 1727204411.12534: _low_level_execute_command(): starting
37031 1727204411.12554: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
37031 1727204411.13311: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.13320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204411.13372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.13376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.13385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.13426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204411.13432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204411.13494: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204411.15167: stdout chunk (state=3): >>>/root <<<
37031 1727204411.15274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204411.15325: stderr chunk (state=3): >>><<<
37031 1727204411.15329: stdout chunk (state=3): >>><<<
37031 1727204411.15354: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
37031 1727204411.15376: _low_level_execute_command(): starting
37031 1727204411.15381: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285 `" && echo ansible-tmp-1727204411.1535318-39291-122646310014285="` echo /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285 `" ) && sleep 0'
37031 1727204411.16675: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.16682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204411.16707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204411.16718: stderr chunk (state=3): >>>debug2: match not found <<<
37031 1727204411.16723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.16735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
37031 1727204411.16743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<<
37031 1727204411.16750: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
37031 1727204411.16755: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
37031 1727204411.16768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.16779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204411.16786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<<
37031 1727204411.16791: stderr chunk (state=3): >>>debug2: match found <<<
37031 1727204411.16798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.16851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204411.16867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
37031 1727204411.16877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204411.16931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204411.18824: stdout chunk (state=3): >>>ansible-tmp-1727204411.1535318-39291-122646310014285=/root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285 <<<
37031 1727204411.18931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
37031 1727204411.19010: stderr chunk (state=3): >>><<<
37031 1727204411.19013: stdout chunk (state=3): >>><<<
37031 1727204411.19025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204411.1535318-39291-122646310014285=/root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
37031 1727204411.19052: variable 'ansible_module_compression' from source: unknown
37031 1727204411.19336: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
37031 1727204411.19339: variable 'ansible_facts' from source: unknown
37031 1727204411.19342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/AnsiballZ_command.py
37031 1727204411.19383: Sending initial data
37031 1727204411.19387: Sent initial data (156 bytes)
37031 1727204411.20343: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.20347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
37031 1727204411.20383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<<
37031 1727204411.20386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.20389: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
37031 1727204411.20391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
37031 1727204411.20442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
37031 1727204411.20445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
37031 1727204411.20491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
37031 1727204411.22210: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
37031 1727204411.22248: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
37031 1727204411.22291:
stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpdrni2wnk /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/AnsiballZ_command.py <<< 37031 1727204411.22319: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204411.23279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.23369: stderr chunk (state=3): >>><<< 37031 1727204411.23373: stdout chunk (state=3): >>><<< 37031 1727204411.23393: done transferring module to remote 37031 1727204411.23403: _low_level_execute_command(): starting 37031 1727204411.23408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/ /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/AnsiballZ_command.py && sleep 0' 37031 1727204411.23841: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.23847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.23880: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204411.23884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.23894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204411.23899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.23915: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204411.23926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.23991: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.23994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.24000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.24039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.25734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.25781: stderr chunk (state=3): >>><<< 37031 1727204411.25784: stdout chunk (state=3): >>><<< 37031 1727204411.25800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.25804: _low_level_execute_command(): starting 37031 1727204411.25806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/AnsiballZ_command.py && sleep 0' 37031 1727204411.26250: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.26254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.26292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.26296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.26298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.26345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.26348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.26405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.39896: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, 
"cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:00:11.393491", "end": "2024-09-24 15:00:11.398143", "delta": "0:00:00.004652", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204411.41124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. <<< 37031 1727204411.41135: stderr chunk (state=3): >>><<< 37031 1727204411.41138: stdout chunk (state=3): >>><<< 37031 1727204411.41175: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-24 15:00:11.393491", "end": "2024-09-24 15:00:11.398143", "delta": "0:00:00.004652", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204411.41231: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204411.41236: _low_level_execute_command(): starting 37031 1727204411.41238: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204411.1535318-39291-122646310014285/ > /dev/null 2>&1 && sleep 0' 37031 1727204411.41806: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204411.41816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.41826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.41841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.41881: stderr chunk (state=3): >>>debug2: checking match for 'final 
all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.41895: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204411.41898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.41912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204411.41919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204411.41926: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204411.41934: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.41944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.41956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.41962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.41973: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204411.41981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.42053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.42073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.42085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.42154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.44041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.44045: stderr chunk (state=3): >>><<< 37031 1727204411.44055: stdout chunk (state=3): >>><<< 37031 1727204411.44074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.44080: handler run complete 37031 1727204411.44107: Evaluated conditional (False): False 37031 1727204411.44117: attempt loop complete, returning result 37031 1727204411.44120: _execute() done 37031 1727204411.44122: dumping result to json 37031 1727204411.44128: done dumping result, returning 37031 1727204411.44136: done running TaskExecutor() for managed-node2/TASK: Clean up namespace [0affcd87-79f5-b754-dfb8-0000000000b4] 37031 1727204411.44141: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b4 37031 1727204411.44346: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b4 37031 1727204411.44349: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.004652", "end": "2024-09-24 15:00:11.398143", "rc": 0, "start": "2024-09-24 15:00:11.393491" } 37031 1727204411.44422: no 
more pending results, returning what we have 37031 1727204411.44427: results queue empty 37031 1727204411.44428: checking for any_errors_fatal 37031 1727204411.44433: done checking for any_errors_fatal 37031 1727204411.44434: checking for max_fail_percentage 37031 1727204411.44437: done checking for max_fail_percentage 37031 1727204411.44439: checking to see if all hosts have failed and the running result is not ok 37031 1727204411.44440: done checking to see if all hosts have failed 37031 1727204411.44440: getting the remaining hosts for this loop 37031 1727204411.44442: done getting the remaining hosts for this loop 37031 1727204411.44446: getting the next task for host managed-node2 37031 1727204411.44452: done getting next task for host managed-node2 37031 1727204411.44456: ^ task is: TASK: Verify network state restored to default 37031 1727204411.44460: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204411.44473: getting variables 37031 1727204411.44475: in VariableManager get_vars() 37031 1727204411.44518: Calling all_inventory to load vars for managed-node2 37031 1727204411.44521: Calling groups_inventory to load vars for managed-node2 37031 1727204411.44523: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204411.44533: Calling all_plugins_play to load vars for managed-node2 37031 1727204411.44535: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204411.44538: Calling groups_plugins_play to load vars for managed-node2 37031 1727204411.46453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204411.48432: done with get_vars() 37031 1727204411.48467: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Tuesday 24 September 2024 15:00:11 -0400 (0:00:00.384) 0:00:34.030 ***** 37031 1727204411.48578: entering _queue_task() for managed-node2/include_tasks 37031 1727204411.48955: worker is 1 (out of 1 available) 37031 1727204411.48974: exiting _queue_task() for managed-node2/include_tasks 37031 1727204411.48987: done queuing things up, now waiting for results queue to drain 37031 1727204411.48988: waiting for pending results... 
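For orientation, the "Clean up namespace" task whose execution is traced above can be reconstructed from the logged module_args (`_raw_params: ip netns delete ns1`, module `ansible.legacy.command`). This is a hypothetical sketch, not the actual task file: the `changed_when: false` is an assumption inferred from the `Evaluated conditional (False): False` line and the final `"changed": false` in the reported result, even though the raw module return showed `"changed": true`.

```yaml
# Hypothetical reconstruction -- the real task file is not part of this log.
- name: Clean up namespace
  ansible.legacy.command: ip netns delete ns1
  changed_when: false  # inferred: module returned changed=true, final result shows changed=false
```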
37031 1727204411.49319: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 37031 1727204411.49452: in run() - task 0affcd87-79f5-b754-dfb8-0000000000b5 37031 1727204411.49485: variable 'ansible_search_path' from source: unknown 37031 1727204411.49528: calling self._execute() 37031 1727204411.49643: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204411.49660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204411.49677: variable 'omit' from source: magic vars 37031 1727204411.50191: variable 'ansible_distribution_major_version' from source: facts 37031 1727204411.50212: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204411.50222: _execute() done 37031 1727204411.50234: dumping result to json 37031 1727204411.50242: done dumping result, returning 37031 1727204411.50250: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [0affcd87-79f5-b754-dfb8-0000000000b5] 37031 1727204411.50267: sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b5 37031 1727204411.50408: no more pending results, returning what we have 37031 1727204411.50413: in VariableManager get_vars() 37031 1727204411.50472: Calling all_inventory to load vars for managed-node2 37031 1727204411.50475: Calling groups_inventory to load vars for managed-node2 37031 1727204411.50477: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204411.50492: Calling all_plugins_play to load vars for managed-node2 37031 1727204411.50494: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204411.50498: Calling groups_plugins_play to load vars for managed-node2 37031 1727204411.51649: done sending task result for task 0affcd87-79f5-b754-dfb8-0000000000b5 37031 1727204411.51653: WORKER PROCESS EXITING 37031 1727204411.52555: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204411.54387: done with get_vars() 37031 1727204411.54412: variable 'ansible_search_path' from source: unknown 37031 1727204411.54429: we have included files to process 37031 1727204411.54430: generating all_blocks data 37031 1727204411.54432: done generating all_blocks data 37031 1727204411.54439: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 37031 1727204411.54440: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 37031 1727204411.54443: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 37031 1727204411.54930: done processing included file 37031 1727204411.54933: iterating over new_blocks loaded from include file 37031 1727204411.54934: in VariableManager get_vars() 37031 1727204411.54955: done with get_vars() 37031 1727204411.54959: filtering new block on tags 37031 1727204411.54982: done filtering new block on tags 37031 1727204411.54985: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 37031 1727204411.54990: extending task lists for all hosts with included blocks 37031 1727204411.58045: done extending task lists 37031 1727204411.58048: done processing included files 37031 1727204411.58049: results queue empty 37031 1727204411.58049: checking for any_errors_fatal 37031 1727204411.58054: done checking for any_errors_fatal 37031 1727204411.58055: checking for max_fail_percentage 37031 1727204411.58058: done checking for max_fail_percentage 37031 1727204411.58059: checking to see if all hosts have failed and the running 
result is not ok 37031 1727204411.58060: done checking to see if all hosts have failed 37031 1727204411.58065: getting the remaining hosts for this loop 37031 1727204411.58067: done getting the remaining hosts for this loop 37031 1727204411.58070: getting the next task for host managed-node2 37031 1727204411.58078: done getting next task for host managed-node2 37031 1727204411.58081: ^ task is: TASK: Check routes and DNS 37031 1727204411.58084: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204411.58087: getting variables 37031 1727204411.58088: in VariableManager get_vars() 37031 1727204411.58108: Calling all_inventory to load vars for managed-node2 37031 1727204411.58111: Calling groups_inventory to load vars for managed-node2 37031 1727204411.58113: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204411.58120: Calling all_plugins_play to load vars for managed-node2 37031 1727204411.58123: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204411.58126: Calling groups_plugins_play to load vars for managed-node2 37031 1727204411.59521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204411.61409: done with get_vars() 37031 1727204411.61441: done getting variables 37031 1727204411.61494: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:00:11 -0400 (0:00:00.129) 0:00:34.160 ***** 37031 1727204411.61524: entering _queue_task() for managed-node2/shell 37031 1727204411.61919: worker is 1 (out of 1 available) 37031 1727204411.61932: exiting _queue_task() for managed-node2/shell 37031 1727204411.61945: done queuing things up, now waiting for results queue to drain 37031 1727204411.61946: waiting for pending results... 
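The trace below only records the task path (`check_network_dns.yml:6`) and that the `shell` action plugin was loaded; the task body itself never appears in this log. A purely hypothetical sketch of what such a routes/DNS verification task might look like (command choices are assumptions, guided only by the task name and the IPv6 test playbook context):

```yaml
# Hypothetical sketch -- the real body of check_network_dns.yml is not shown in this log.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    ip -6 route           # inspect the IPv6 routing table (playbook: tests_ipv6.yml)
    cat /etc/resolv.conf  # inspect DNS resolver configuration
  changed_when: false
```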
37031 1727204411.62273: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 37031 1727204411.62405: in run() - task 0affcd87-79f5-b754-dfb8-00000000075e 37031 1727204411.62431: variable 'ansible_search_path' from source: unknown 37031 1727204411.62440: variable 'ansible_search_path' from source: unknown 37031 1727204411.62485: calling self._execute() 37031 1727204411.62601: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204411.62620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204411.62638: variable 'omit' from source: magic vars 37031 1727204411.63090: variable 'ansible_distribution_major_version' from source: facts 37031 1727204411.63101: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204411.63106: variable 'omit' from source: magic vars 37031 1727204411.63135: variable 'omit' from source: magic vars 37031 1727204411.63165: variable 'omit' from source: magic vars 37031 1727204411.63211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204411.63239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204411.63262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204411.63276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204411.63289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204411.63313: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204411.63316: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204411.63319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204411.63390: 
Set connection var ansible_connection to ssh 37031 1727204411.63393: Set connection var ansible_shell_type to sh 37031 1727204411.63400: Set connection var ansible_pipelining to False 37031 1727204411.63408: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204411.63413: Set connection var ansible_timeout to 10 37031 1727204411.63418: Set connection var ansible_shell_executable to /bin/sh 37031 1727204411.63437: variable 'ansible_shell_executable' from source: unknown 37031 1727204411.63440: variable 'ansible_connection' from source: unknown 37031 1727204411.63443: variable 'ansible_module_compression' from source: unknown 37031 1727204411.63445: variable 'ansible_shell_type' from source: unknown 37031 1727204411.63447: variable 'ansible_shell_executable' from source: unknown 37031 1727204411.63449: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204411.63453: variable 'ansible_pipelining' from source: unknown 37031 1727204411.63459: variable 'ansible_timeout' from source: unknown 37031 1727204411.63461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204411.63567: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204411.63576: variable 'omit' from source: magic vars 37031 1727204411.63581: starting attempt loop 37031 1727204411.63585: running the handler 37031 1727204411.63595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204411.63611: 
_low_level_execute_command(): starting 37031 1727204411.63619: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204411.64164: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.64187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.64202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.64215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.64270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.64278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.64328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.65910: stdout chunk (state=3): >>>/root <<< 37031 1727204411.66040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.66191: stderr chunk (state=3): >>><<< 37031 1727204411.66194: stdout chunk (state=3): >>><<< 37031 1727204411.66275: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.66279: _low_level_execute_command(): starting 37031 1727204411.66283: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182 `" && echo ansible-tmp-1727204411.662169-39318-124195605555182="` echo /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182 `" ) && sleep 0' 37031 1727204411.66904: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204411.66919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.66933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.66968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.67010: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.67024: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204411.67038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.67059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204411.67074: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204411.67087: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204411.67099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.67110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.67124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.67136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.67148: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204411.67192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.67259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.67282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.67286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.67340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.69188: stdout chunk (state=3): >>>ansible-tmp-1727204411.662169-39318-124195605555182=/root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182 <<< 37031 1727204411.69372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 37031 1727204411.69398: stderr chunk (state=3): >>><<< 37031 1727204411.69401: stdout chunk (state=3): >>><<< 37031 1727204411.69423: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204411.662169-39318-124195605555182=/root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.69462: variable 'ansible_module_compression' from source: unknown 37031 1727204411.69518: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204411.69559: variable 'ansible_facts' from source: unknown 37031 1727204411.69642: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/AnsiballZ_command.py 37031 1727204411.70339: Sending initial data 37031 1727204411.70343: Sent initial data (155 
bytes) 37031 1727204411.71451: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204411.71483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.71501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.71521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.71568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.71588: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204411.71608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.71626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204411.71640: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204411.71651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204411.71669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.71686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.71713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.71728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.71741: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204411.71756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.71844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.71874: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 
1727204411.71889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.71958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.73667: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204411.73705: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204411.73744: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpqetxugz_ /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/AnsiballZ_command.py <<< 37031 1727204411.73782: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204411.74846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.74999: stderr chunk (state=3): >>><<< 37031 1727204411.75002: stdout chunk (state=3): >>><<< 37031 1727204411.75025: done transferring module to remote 37031 1727204411.75035: _low_level_execute_command(): starting 37031 1727204411.75040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/ /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/AnsiballZ_command.py && sleep 0' 37031 1727204411.75495: stderr chunk (state=2): 
>>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.75501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.75540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.75544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.75550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.75563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204411.75572: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.75630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.75634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.75684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.77414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.77462: stderr chunk (state=3): >>><<< 37031 1727204411.77468: stdout chunk (state=3): >>><<< 37031 1727204411.77484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.77487: _low_level_execute_command(): starting 37031 1727204411.77493: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/AnsiballZ_command.py && sleep 0' 37031 1727204411.77939: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.77944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.77994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204411.77998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204411.78001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found <<< 37031 1727204411.78004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.78052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.78056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.78069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.78127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.92011: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3145sec preferred_lft 3145sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 
10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:00:11.910859", "end": "2024-09-24 15:00:11.919006", "delta": "0:00:00.008147", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204411.93144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204411.93201: stderr chunk (state=3): >>><<< 37031 1727204411.93205: stdout chunk (state=3): >>><<< 37031 1727204411.93224: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3145sec preferred_lft 3145sec\n inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.13.78 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:00:11.910859", "end": "2024-09-24 15:00:11.919006", "delta": "0:00:00.008147", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
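The exchange above began with Ansible creating its remote working directory via `/bin/sh -c '( umask 77 && mkdir -p ... )'`. Below is a minimal standalone sketch of that same umask-guarded mkdir pattern; the `demo_base` path is illustrative, not taken from the log.

```shell
# Sketch of the tmp-dir pattern Ansible runs over SSH (cf. the
# _low_level_execute_command() mkdir call earlier in this log).
set -eu
demo_base="${TMPDIR:-/tmp}/ansible-demo-$$"    # illustrative path
# umask 77 inside the subshell means both directories are created
# mode 0700 (owner-only), regardless of the caller's umask --
# matching Ansible's `( umask 77 && mkdir -p "..." && mkdir "..." )`.
( umask 77 && mkdir -p "$demo_base" && mkdir "$demo_base/ansible-tmp-$$" )
stat -c '%a' "$demo_base/ansible-tmp-$$"   # prints 700
rm -rf "$demo_base"
```

The subshell keeps the restrictive umask from leaking into the rest of the session, which is why Ansible wraps the whole sequence in parentheses.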
37031 1727204411.93265: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204411.93272: _low_level_execute_command(): starting 37031 1727204411.93277: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204411.662169-39318-124195605555182/ > /dev/null 2>&1 && sleep 0' 37031 1727204411.93737: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.93741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204411.93797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found <<< 37031 1727204411.93801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 37031 1727204411.93803: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204411.93805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204411.93860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204411.93869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204411.93879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204411.93907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204411.95667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204411.95762: stderr chunk (state=3): >>><<< 37031 1727204411.95767: stdout chunk (state=3): >>><<< 37031 1727204411.95789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204411.95798: handler run complete 37031 1727204411.95837: Evaluated conditional (False): False 37031 1727204411.95852: attempt loop complete, returning result 37031 1727204411.95855: _execute() done 37031 1727204411.95860: dumping result to json 37031 1727204411.95862: done dumping result, returning 37031 1727204411.95873: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [0affcd87-79f5-b754-dfb8-00000000075e] 37031 1727204411.95878: sending task result for task 0affcd87-79f5-b754-dfb8-00000000075e ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008147", "end": "2024-09-24 15:00:11.919006", "rc": 0, "start": "2024-09-24 15:00:11.910859" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:ff:f5:f2:b9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.13.78/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3145sec preferred_lft 3145sec inet6 fe80::8ff:ffff:fef5:f2b9/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.13.78 metric 100 10.31.12.0/22 dev eth0 proto kernel scope 
link src 10.31.13.78 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 37031 1727204411.96082: no more pending results, returning what we have 37031 1727204411.96088: results queue empty 37031 1727204411.96089: checking for any_errors_fatal 37031 1727204411.96091: done checking for any_errors_fatal 37031 1727204411.96092: checking for max_fail_percentage 37031 1727204411.96094: done checking for max_fail_percentage 37031 1727204411.96095: checking to see if all hosts have failed and the running result is not ok 37031 1727204411.96096: done checking to see if all hosts have failed 37031 1727204411.96097: getting the remaining hosts for this loop 37031 1727204411.96099: done getting the remaining hosts for this loop 37031 1727204411.96103: getting the next task for host managed-node2 37031 1727204411.96110: done getting next task for host managed-node2 37031 1727204411.96113: ^ task is: TASK: Verify DNS and network connectivity 37031 1727204411.96116: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 37031 1727204411.96121: getting variables 37031 1727204411.96123: in VariableManager get_vars() 37031 1727204411.96170: Calling all_inventory to load vars for managed-node2 37031 1727204411.96173: Calling groups_inventory to load vars for managed-node2 37031 1727204411.96180: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000075e 37031 1727204411.96194: WORKER PROCESS EXITING 37031 1727204411.96189: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204411.96280: Calling all_plugins_play to load vars for managed-node2 37031 1727204411.96283: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204411.96286: Calling groups_plugins_play to load vars for managed-node2 37031 1727204411.97996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204411.99811: done with get_vars() 37031 1727204411.99842: done getting variables 37031 1727204411.99915: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:00:11 -0400 (0:00:00.384) 0:00:34.544 ***** 37031 1727204411.99949: entering _queue_task() for managed-node2/shell 37031 1727204412.00339: worker is 1 (out of 1 available) 37031 1727204412.00354: exiting _queue_task() for managed-node2/shell 37031 1727204412.00369: done queuing things up, now waiting for results queue to drain 37031 1727204412.00370: waiting for pending results... 
37031 1727204412.00691: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 37031 1727204412.00794: in run() - task 0affcd87-79f5-b754-dfb8-00000000075f 37031 1727204412.00808: variable 'ansible_search_path' from source: unknown 37031 1727204412.00813: variable 'ansible_search_path' from source: unknown 37031 1727204412.00861: calling self._execute() 37031 1727204412.00966: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204412.00975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204412.00990: variable 'omit' from source: magic vars 37031 1727204412.01396: variable 'ansible_distribution_major_version' from source: facts 37031 1727204412.01413: Evaluated conditional (ansible_distribution_major_version != '6'): True 37031 1727204412.01563: variable 'ansible_facts' from source: unknown 37031 1727204412.02438: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 37031 1727204412.02443: variable 'omit' from source: magic vars 37031 1727204412.02501: variable 'omit' from source: magic vars 37031 1727204412.02536: variable 'omit' from source: magic vars 37031 1727204412.02597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 37031 1727204412.02637: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 37031 1727204412.02662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 37031 1727204412.02686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204412.02708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 37031 1727204412.02740: variable 'inventory_hostname' from source: host vars for 'managed-node2' 37031 1727204412.02743: variable 
'ansible_host' from source: host vars for 'managed-node2' 37031 1727204412.02746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204412.03033: Set connection var ansible_connection to ssh 37031 1727204412.03038: Set connection var ansible_shell_type to sh 37031 1727204412.03041: Set connection var ansible_pipelining to False 37031 1727204412.03044: Set connection var ansible_module_compression to ZIP_DEFLATED 37031 1727204412.03047: Set connection var ansible_timeout to 10 37031 1727204412.03049: Set connection var ansible_shell_executable to /bin/sh 37031 1727204412.03052: variable 'ansible_shell_executable' from source: unknown 37031 1727204412.03055: variable 'ansible_connection' from source: unknown 37031 1727204412.03060: variable 'ansible_module_compression' from source: unknown 37031 1727204412.03063: variable 'ansible_shell_type' from source: unknown 37031 1727204412.03067: variable 'ansible_shell_executable' from source: unknown 37031 1727204412.03070: variable 'ansible_host' from source: host vars for 'managed-node2' 37031 1727204412.03072: variable 'ansible_pipelining' from source: unknown 37031 1727204412.03075: variable 'ansible_timeout' from source: unknown 37031 1727204412.03078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 37031 1727204412.03228: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204412.03232: variable 'omit' from source: magic vars 37031 1727204412.03234: starting attempt loop 37031 1727204412.03238: running the handler 37031 1727204412.03240: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 37031 1727204412.03245: _low_level_execute_command(): starting 37031 1727204412.03255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 37031 1727204412.04143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.04155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.04168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.04189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.04229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.04242: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.04253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.04267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.04282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.04290: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204412.04299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.04314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.04325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.04332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.04341: stderr chunk (state=3): >>>debug2: match found <<< 37031 
1727204412.04359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.04434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.04459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.04477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.04582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.06100: stdout chunk (state=3): >>>/root <<< 37031 1727204412.06282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204412.06306: stderr chunk (state=3): >>><<< 37031 1727204412.06310: stdout chunk (state=3): >>><<< 37031 1727204412.06345: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 37031 1727204412.06361: _low_level_execute_command(): starting 37031 1727204412.06367: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813 `" && echo ansible-tmp-1727204412.0634432-39341-132501030720813="` echo /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813 `" ) && sleep 0' 37031 1727204412.07194: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.07468: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.07476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.07479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.07482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.07484: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.07486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.07488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.07490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.07492: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204412.07494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.07496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.07497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.07499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.07501: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204412.07503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.07505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.07507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.07509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.07680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.09466: stdout chunk (state=3): >>>ansible-tmp-1727204412.0634432-39341-132501030720813=/root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813 <<< 37031 1727204412.09585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204412.09681: stderr chunk (state=3): >>><<< 37031 1727204412.09684: stdout chunk (state=3): >>><<< 37031 1727204412.09707: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204412.0634432-39341-132501030720813=/root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204412.09743: variable 'ansible_module_compression' from source: unknown 37031 1727204412.09808: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-37031mdn2lq2k/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 37031 1727204412.09844: variable 'ansible_facts' from source: unknown 37031 1727204412.09930: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/AnsiballZ_command.py 37031 1727204412.10522: Sending initial data 37031 1727204412.10526: Sent initial data (156 bytes) 37031 1727204412.13745: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.13817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.13829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.13847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.13891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.13954: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.13966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.13983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.14026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.14033: stderr chunk (state=3): 
>>>debug1: re-parsing configuration <<< 37031 1727204412.14042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.14054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.14072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.14081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.14086: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204412.14096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.14282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.14370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.14417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.14520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.16248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 37031 1727204412.16312: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 37031 1727204412.16316: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-37031mdn2lq2k/tmpw2lc_k34 /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/AnsiballZ_command.py <<< 37031 1727204412.16351: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 37031 1727204412.17566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204412.17840: stderr chunk (state=3): >>><<< 37031 1727204412.17843: stdout chunk (state=3): >>><<< 37031 1727204412.17846: done transferring module to remote 37031 1727204412.17848: _low_level_execute_command(): starting 37031 1727204412.17850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/ /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/AnsiballZ_command.py && sleep 0' 37031 1727204412.19394: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.19414: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.19424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.19437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.19549: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.19559: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.19569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.19585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.19593: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.19599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204412.19607: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.19627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.19643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.19650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.19659: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204412.19679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.19868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.19876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.19879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.19996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.21824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204412.21828: stdout chunk (state=3): >>><<< 37031 1727204412.21836: stderr chunk (state=3): >>><<< 37031 1727204412.21855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 37031 1727204412.21861: _low_level_execute_command(): starting 37031 1727204412.21866: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/AnsiballZ_command.py && sleep 0' 37031 1727204412.22771: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.22775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.22777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.22780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.22782: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.22784: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.22790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.22796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.22799: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.22801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204412.22802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.22804: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.22806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.22808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.22809: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204412.22811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.22813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.22815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.22820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.23156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.75853: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2178 0 --:--:-- --:--:-- --:--:-- 2194\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1281 0 --:--:-- --:--:-- --:--:-- 1287", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:00:12.361034", "end": "2024-09-24 15:00:12.757433", "delta": "0:00:00.396399", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 37031 1727204412.77175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 
<<< 37031 1727204412.77251: stderr chunk (state=3): >>><<< 37031 1727204412.77256: stdout chunk (state=3): >>><<< 37031 1727204412.77285: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 2178 0 --:--:-- --:--:-- --:--:-- 2194\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1281 0 --:--:-- --:--:-- --:--:-- 1287", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 15:00:12.361034", "end": "2024-09-24 15:00:12.757433", "delta": "0:00:00.396399", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.78 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.78 closed. 37031 1727204412.77330: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 37031 1727204412.77338: _low_level_execute_command(): starting 37031 1727204412.77343: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204412.0634432-39341-132501030720813/ > /dev/null 2>&1 && sleep 0' 37031 1727204412.78011: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 37031 1727204412.78021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.78030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.78045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.78089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 
originally 10.31.13.78 <<< 37031 1727204412.78096: stderr chunk (state=3): >>>debug2: match not found <<< 37031 1727204412.78105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.78118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 37031 1727204412.78126: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.13.78 is address <<< 37031 1727204412.78133: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 37031 1727204412.78140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 37031 1727204412.78149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 37031 1727204412.78161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 37031 1727204412.78176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78 <<< 37031 1727204412.78183: stderr chunk (state=3): >>>debug2: match found <<< 37031 1727204412.78193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 37031 1727204412.78282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 37031 1727204412.78286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 37031 1727204412.78293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 37031 1727204412.78397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 37031 1727204412.80184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 37031 1727204412.80268: stderr chunk (state=3): >>><<< 37031 1727204412.80272: stdout chunk (state=3): >>><<< 37031 1727204412.80279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.13.78 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.13.78 originally 10.31.13.78
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
37031 1727204412.80296: handler run complete
37031 1727204412.80314: Evaluated conditional (False): False
37031 1727204412.80324: attempt loop complete, returning result
37031 1727204412.80327: _execute() done
37031 1727204412.80330: dumping result to json
37031 1727204412.80336: done dumping result, returning
37031 1727204412.80345: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [0affcd87-79f5-b754-dfb8-00000000075f]
37031 1727204412.80349: sending task result for task 0affcd87-79f5-b754-dfb8-00000000075f
37031 1727204412.80467: done sending task result for task 0affcd87-79f5-b754-dfb8-00000000075f
37031 1727204412.80470: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.396399",
    "end": "2024-09-24 15:00:12.757433",
    "rc": 0,
    "start": "2024-09-24 15:00:12.361034"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 2178 0 --:--:-- --:--:-- --:--:-- 2194
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 1281 0 --:--:-- --:--:-- --:--:-- 1287

37031 1727204412.80539: no more pending results,
returning what we have 37031 1727204412.80543: results queue empty 37031 1727204412.80544: checking for any_errors_fatal 37031 1727204412.80556: done checking for any_errors_fatal 37031 1727204412.80557: checking for max_fail_percentage 37031 1727204412.80559: done checking for max_fail_percentage 37031 1727204412.80561: checking to see if all hosts have failed and the running result is not ok 37031 1727204412.80562: done checking to see if all hosts have failed 37031 1727204412.80562: getting the remaining hosts for this loop 37031 1727204412.80566: done getting the remaining hosts for this loop 37031 1727204412.80571: getting the next task for host managed-node2 37031 1727204412.80580: done getting next task for host managed-node2 37031 1727204412.80582: ^ task is: TASK: meta (flush_handlers) 37031 1727204412.80584: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204412.80590: getting variables 37031 1727204412.80592: in VariableManager get_vars() 37031 1727204412.80634: Calling all_inventory to load vars for managed-node2 37031 1727204412.80637: Calling groups_inventory to load vars for managed-node2 37031 1727204412.80639: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204412.80648: Calling all_plugins_play to load vars for managed-node2 37031 1727204412.80650: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204412.80653: Calling groups_plugins_play to load vars for managed-node2 37031 1727204412.82558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204412.84251: done with get_vars() 37031 1727204412.84287: done getting variables 37031 1727204412.84375: in VariableManager get_vars() 37031 1727204412.84391: Calling all_inventory to load vars for managed-node2 37031 1727204412.84394: Calling groups_inventory to load vars for managed-node2 37031 1727204412.84396: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204412.84401: Calling all_plugins_play to load vars for managed-node2 37031 1727204412.84404: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204412.84407: Calling groups_plugins_play to load vars for managed-node2 37031 1727204412.85655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204412.87340: done with get_vars() 37031 1727204412.87382: done queuing things up, now waiting for results queue to drain 37031 1727204412.87384: results queue empty 37031 1727204412.87385: checking for any_errors_fatal 37031 1727204412.87389: done checking for any_errors_fatal 37031 1727204412.87390: checking for max_fail_percentage 37031 1727204412.87391: done checking for max_fail_percentage 37031 1727204412.87392: checking to see if all hosts have failed and the running result is not 
ok 37031 1727204412.87393: done checking to see if all hosts have failed 37031 1727204412.87394: getting the remaining hosts for this loop 37031 1727204412.87395: done getting the remaining hosts for this loop 37031 1727204412.87398: getting the next task for host managed-node2 37031 1727204412.87402: done getting next task for host managed-node2 37031 1727204412.87404: ^ task is: TASK: meta (flush_handlers) 37031 1727204412.87405: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 37031 1727204412.87413: getting variables 37031 1727204412.87414: in VariableManager get_vars() 37031 1727204412.87431: Calling all_inventory to load vars for managed-node2 37031 1727204412.87434: Calling groups_inventory to load vars for managed-node2 37031 1727204412.87436: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204412.87442: Calling all_plugins_play to load vars for managed-node2 37031 1727204412.87444: Calling groups_plugins_inventory to load vars for managed-node2 37031 1727204412.87447: Calling groups_plugins_play to load vars for managed-node2 37031 1727204412.88750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204412.90387: done with get_vars() 37031 1727204412.90413: done getting variables 37031 1727204412.90478: in VariableManager get_vars() 37031 1727204412.90495: Calling all_inventory to load vars for managed-node2 37031 1727204412.90497: Calling groups_inventory to load vars for managed-node2 37031 1727204412.90499: Calling all_plugins_inventory to load vars for managed-node2 37031 1727204412.90504: Calling all_plugins_play to load vars for managed-node2 37031 1727204412.90507: Calling groups_plugins_inventory to load vars for 
managed-node2 37031 1727204412.90510: Calling groups_plugins_play to load vars for managed-node2 37031 1727204412.91754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 37031 1727204412.93603: done with get_vars() 37031 1727204412.93632: done queuing things up, now waiting for results queue to drain 37031 1727204412.93636: results queue empty 37031 1727204412.93637: checking for any_errors_fatal 37031 1727204412.93639: done checking for any_errors_fatal 37031 1727204412.93639: checking for max_fail_percentage 37031 1727204412.93641: done checking for max_fail_percentage 37031 1727204412.93641: checking to see if all hosts have failed and the running result is not ok 37031 1727204412.93642: done checking to see if all hosts have failed 37031 1727204412.93643: getting the remaining hosts for this loop 37031 1727204412.93644: done getting the remaining hosts for this loop 37031 1727204412.93647: getting the next task for host managed-node2 37031 1727204412.93650: done getting next task for host managed-node2 37031 1727204412.93651: ^ task is: None 37031 1727204412.93653: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 37031 1727204412.93654: done queuing things up, now waiting for results queue to drain 37031 1727204412.93655: results queue empty 37031 1727204412.93655: checking for any_errors_fatal 37031 1727204412.93656: done checking for any_errors_fatal 37031 1727204412.93657: checking for max_fail_percentage 37031 1727204412.93658: done checking for max_fail_percentage 37031 1727204412.93658: checking to see if all hosts have failed and the running result is not ok 37031 1727204412.93659: done checking to see if all hosts have failed 37031 1727204412.93662: getting the next task for host managed-node2 37031 1727204412.93667: done getting next task for host managed-node2 37031 1727204412.93667: ^ task is: None 37031 1727204412.93669: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False PLAY RECAP ********************************************************************* managed-node2 : ok=75 changed=2 unreachable=0 failed=0 skipped=63 rescued=0 ignored=0 Tuesday 24 September 2024 15:00:12 -0400 (0:00:00.938) 0:00:35.482 ***** =============================================================================== fedora.linux_system_roles.network : Configure networking connection profiles --- 2.89s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Check which services are running ---- 1.80s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.73s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6 Install iproute --------------------------------------------------------- 1.64s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Check which services are running ---- 1.54s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Ensure ping6 command is present ----------------------------------------- 1.38s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Install iproute --------------------------------------------------------- 1.30s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Create veth interface veth0 --------------------------------------------- 1.13s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 fedora.linux_system_roles.network : Check which packages are installed --- 1.12s 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.01s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Verify DNS and network connectivity ------------------------------------- 0.94s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 fedora.linux_system_roles.network : Check which packages are installed --- 0.86s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.86s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 fedora.linux_system_roles.network : Configure networking connection profiles --- 0.69s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.62s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Get NM profile info ----------------------------------------------------- 0.59s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Check if system is ostree ----------------------------------------------- 0.57s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.54s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Stat profile file ------------------------------------------------------- 0.54s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 
Gather current interface info ------------------------------------------- 0.53s /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 37031 1727204412.93809: RUNNING CLEANUP
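For reference, the shell payload of the "Verify DNS and network connectivity" task appears above only as an escaped JSON `cmd` string. Below is a sketch of the same logic unescaped, restructured as a function for readability; the function name `check_connectivity` and the argument-passing style are mine, not from the play, and the host list is the one shown in the log output.

```shell
#!/usr/bin/env bash
# Sketch of the task's inline script, reconstructed from the "cmd" field
# in the log above. The original runs as a flat script; wrapping it in a
# function (name is hypothetical) just makes the hosts parameterizable.
check_connectivity() {
  set -euo pipefail
  echo CHECK DNS AND CONNECTIVITY
  for host in "$@"; do
    # DNS resolution check, as in the original: getent consults NSS,
    # so it exercises the same lookup path the system uses.
    if ! getent hosts "$host"; then
      echo FAILED to lookup host "$host"
      return 1
    fi
    # Reachability check: discard the body, fail on connection errors.
    if ! curl -o /dev/null "https://$host"; then
      echo FAILED to contact host "$host"
      return 1
    fi
  done
}

# The play invokes the equivalent of:
#   check_connectivity mirrors.fedoraproject.org mirrors.centos.org
```

Note `rc: 0` in the task result above: both mirrors resolved (to IPv6 `wildcard.fedoraproject.org` addresses, per STDOUT) and answered over HTTPS, so neither failure branch fired.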