[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 25052 1726882462.55061: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-spT executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 25052 1726882462.55489: Added group all to inventory 25052 1726882462.55491: Added group ungrouped to inventory 25052 1726882462.55497: Group all now contains ungrouped 25052 1726882462.55500: Examining possible inventory source: /tmp/network-Kc3/inventory.yml 25052 1726882462.75874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 25052 1726882462.76043: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 25052 1726882462.76068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 25052 1726882462.76234: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 25052 1726882462.76312: Loaded config def from plugin (inventory/script) 25052 1726882462.76315: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 25052 1726882462.76357: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 25052 1726882462.76653: Loaded config def from plugin (inventory/yaml) 25052 1726882462.76655: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 25052 1726882462.76749: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 25052 1726882462.77785: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 25052 1726882462.77788: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 25052 1726882462.77795: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 25052 1726882462.77803: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 25052 1726882462.77809: Loading data from /tmp/network-Kc3/inventory.yml 25052 1726882462.77878: /tmp/network-Kc3/inventory.yml was not parsable by auto 25052 1726882462.77947: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 25052 1726882462.77988: Loading data from /tmp/network-Kc3/inventory.yml 25052 1726882462.78279: group all already in inventory 25052 1726882462.78287: set inventory_file for managed_node1 25052 1726882462.78295: set inventory_dir for managed_node1 25052 1726882462.78296: Added host managed_node1 to inventory 25052 1726882462.78299: Added host managed_node1 to group all 25052 1726882462.78300: set ansible_host for managed_node1 25052 1726882462.78301: 
set ansible_ssh_extra_args for managed_node1 25052 1726882462.78305: set inventory_file for managed_node2 25052 1726882462.78308: set inventory_dir for managed_node2 25052 1726882462.78309: Added host managed_node2 to inventory 25052 1726882462.78310: Added host managed_node2 to group all 25052 1726882462.78311: set ansible_host for managed_node2 25052 1726882462.78312: set ansible_ssh_extra_args for managed_node2 25052 1726882462.78314: set inventory_file for managed_node3 25052 1726882462.78317: set inventory_dir for managed_node3 25052 1726882462.78318: Added host managed_node3 to inventory 25052 1726882462.78319: Added host managed_node3 to group all 25052 1726882462.78320: set ansible_host for managed_node3 25052 1726882462.78320: set ansible_ssh_extra_args for managed_node3 25052 1726882462.78323: Reconcile groups and hosts in inventory. 25052 1726882462.78327: Group ungrouped now contains managed_node1 25052 1726882462.78329: Group ungrouped now contains managed_node2 25052 1726882462.78330: Group ungrouped now contains managed_node3 25052 1726882462.78614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 25052 1726882462.78744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 25052 1726882462.78997: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 25052 1726882462.79024: Loaded config def from plugin (vars/host_group_vars) 25052 1726882462.79026: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 25052 1726882462.79032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 25052 1726882462.79039: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 25052 1726882462.79074: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 25052 1726882462.79686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882462.79883: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 25052 1726882462.79926: Loaded config def from plugin (connection/local) 25052 1726882462.79929: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 25052 1726882462.81446: Loaded config def from plugin (connection/paramiko_ssh) 25052 1726882462.81449: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 25052 1726882462.82446: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25052 1726882462.82507: Loaded config def from plugin (connection/psrp) 25052 1726882462.82510: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 25052 1726882462.83283: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25052 1726882462.83331: Loaded config def from plugin (connection/ssh) 25052 1726882462.83333: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 25052 1726882462.86150: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 25052 1726882462.86307: Loaded config def from plugin (connection/winrm) 25052 1726882462.86310: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 25052 1726882462.86337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 25052 1726882462.86395: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 25052 1726882462.86570: Loaded config def from plugin (shell/cmd) 25052 1726882462.86572: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 25052 1726882462.86701: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 25052 1726882462.86772: Loaded config def from plugin (shell/powershell) 25052 1726882462.86775: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 25052 1726882462.86982: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 25052 1726882462.87167: Loaded config def from plugin (shell/sh) 25052 1726882462.87169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 25052 1726882462.87201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 25052 1726882462.87313: Loaded config def from plugin (become/runas) 25052 1726882462.87316: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 25052 1726882462.87476: Loaded config def from plugin (become/su) 25052 1726882462.87478: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 25052 1726882462.87617: Loaded config def from plugin (become/sudo) 25052 1726882462.87620: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 25052 1726882462.87650: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml 25052 1726882462.87956: in VariableManager get_vars() 25052 1726882462.87979: done with get_vars() 25052 1726882462.88103: trying /usr/local/lib/python3.12/site-packages/ansible/modules 25052 1726882462.91336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 25052 1726882462.91455: in VariableManager get_vars() 25052 
1726882462.91461: done with get_vars() 25052 1726882462.91464: variable 'playbook_dir' from source: magic vars 25052 1726882462.91465: variable 'ansible_playbook_python' from source: magic vars 25052 1726882462.91466: variable 'ansible_config_file' from source: magic vars 25052 1726882462.91466: variable 'groups' from source: magic vars 25052 1726882462.91467: variable 'omit' from source: magic vars 25052 1726882462.91468: variable 'ansible_version' from source: magic vars 25052 1726882462.91469: variable 'ansible_check_mode' from source: magic vars 25052 1726882462.91469: variable 'ansible_diff_mode' from source: magic vars 25052 1726882462.91470: variable 'ansible_forks' from source: magic vars 25052 1726882462.91471: variable 'ansible_inventory_sources' from source: magic vars 25052 1726882462.91472: variable 'ansible_skip_tags' from source: magic vars 25052 1726882462.91473: variable 'ansible_limit' from source: magic vars 25052 1726882462.91473: variable 'ansible_run_tags' from source: magic vars 25052 1726882462.91474: variable 'ansible_verbosity' from source: magic vars 25052 1726882462.91512: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml 25052 1726882462.92119: in VariableManager get_vars() 25052 1726882462.92136: done with get_vars() 25052 1726882462.92200: in VariableManager get_vars() 25052 1726882462.92215: done with get_vars() 25052 1726882462.92765: in VariableManager get_vars() 25052 1726882462.92779: done with get_vars() 25052 1726882462.92784: variable 'omit' from source: magic vars 25052 1726882462.92875: variable 'omit' from source: magic vars 25052 1726882462.92912: in VariableManager get_vars() 25052 1726882462.92923: done with get_vars() 25052 1726882462.92969: in VariableManager get_vars() 25052 1726882462.92982: done with get_vars() 25052 1726882462.93121: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25052 1726882462.93500: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25052 1726882462.93731: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25052 1726882462.94942: in VariableManager get_vars() 25052 1726882462.94959: done with get_vars() 25052 1726882462.95253: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 25052 1726882462.95343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25052 1726882462.96435: in VariableManager get_vars() 25052 1726882462.96455: done with get_vars() 25052 1726882462.96498: in VariableManager get_vars() 25052 1726882462.96535: done with get_vars() 25052 1726882462.97526: in VariableManager get_vars() 25052 1726882462.97542: done with get_vars() 25052 1726882462.97546: variable 'omit' from source: magic vars 25052 1726882462.97557: variable 'omit' from source: magic vars 25052 1726882462.97587: in VariableManager get_vars() 25052 1726882462.97609: done with get_vars() 25052 1726882462.97633: in VariableManager get_vars() 25052 1726882462.97649: done with get_vars() 25052 1726882462.97676: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 25052 1726882462.98121: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 25052 1726882462.98488: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 25052 1726882463.00181: in VariableManager get_vars() 25052 1726882463.00201: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25052 1726882463.02616: in VariableManager get_vars() 25052 1726882463.02631: done with get_vars() 25052 1726882463.02720: in VariableManager get_vars() 25052 1726882463.02733: done with get_vars() 25052 1726882463.02780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 25052 1726882463.02796: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 25052 1726882463.02969: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 25052 1726882463.03064: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 25052 1726882463.03067: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 25052 1726882463.03087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 25052 1726882463.03106: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 25052 1726882463.03205: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 25052 1726882463.03242: Loaded config def from plugin (callback/default) 25052 1726882463.03245: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 25052 1726882463.04267: Loaded config def from plugin (callback/junit) 25052 1726882463.04270: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 25052 1726882463.04317: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 25052 1726882463.04371: Loaded config def from plugin (callback/minimal) 25052 1726882463.04373: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 25052 
1726882463.04414: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
25052 1726882463.04471: Loaded config def from plugin (callback/tree)
25052 1726882463.04473: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
25052 1726882463.04581: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
25052 1726882463.04584: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ipv6_nm.yml ****************************************************
2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
25052 1726882463.04614: in VariableManager get_vars()
25052 1726882463.04627: done with get_vars()
25052 1726882463.04633: in VariableManager get_vars()
25052 1726882463.04641: done with get_vars()
25052 1726882463.04645: variable 'omit' from source: magic vars
25052 1726882463.04682: in VariableManager get_vars()
25052 1726882463.04699: done with get_vars()
25052 1726882463.04720: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ipv6.yml' with nm as provider] *************
25052 1726882463.05270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
25052 1726882463.05343: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
25052 1726882463.05397: getting the remaining hosts for this loop
25052 1726882463.05398: done getting the remaining hosts for this loop
25052 1726882463.05400: getting the next task for host managed_node2
25052 1726882463.05402: done getting next task for host managed_node2
25052 1726882463.05404: ^ task is: TASK: Gathering Facts
25052 1726882463.05405: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
25052 1726882463.05406: getting variables
25052 1726882463.05407: in VariableManager get_vars()
25052 1726882463.05416: Calling all_inventory to load vars for managed_node2
25052 1726882463.05418: Calling groups_inventory to load vars for managed_node2
25052 1726882463.05423: Calling all_plugins_inventory to load vars for managed_node2
25052 1726882463.05445: Calling all_plugins_play to load vars for managed_node2
25052 1726882463.05456: Calling groups_plugins_inventory to load vars for managed_node2
25052 1726882463.05459: Calling groups_plugins_play to load vars for managed_node2
25052 1726882463.05481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
25052 1726882463.05518: done with get_vars()
25052 1726882463.05523: done getting variables
25052 1726882463.05573: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Friday 20 September 2024 21:34:23 -0400 (0:00:00.010) 0:00:00.010 ******
25052 1726882463.05587: entering _queue_task() for managed_node2/gather_facts
25052 1726882463.05588: Creating lock for gather_facts
25052 1726882463.05861: worker is 1 (out of 1 available)
25052 1726882463.05871: exiting _queue_task() for managed_node2/gather_facts
25052 1726882463.05882: done queuing things up, now waiting for results queue to drain
25052 1726882463.05884: waiting for pending results...
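Every debug record in this output carries the controller worker's PID and a Unix timestamp, for example "25052 1726882463.05587: entering _queue_task() for managed_node2/gather_facts", so the deltas between consecutive records show where startup and task-queuing time actually goes. Below is a minimal sketch of pulling those prefixes apart and printing per-record deltas. It assumes the capture is stored one record per line and that the "<pid> <epoch>: <message>" shape holds throughout; non-matching lines (task banners, SSH stderr chunks) are skipped, and the file name is a placeholder rather than a real path from this run.

#!/usr/bin/env python3
"""Sketch: time the "<pid> <epoch>: <message>" debug records in a verbose
ansible-playbook log like the one above.  Assumes one record per line;
non-matching lines (banners, SSH stderr chunks) are skipped."""
import re
import sys

RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")

def iter_records(lines):
    """Yield (pid, timestamp, message) for each line that carries the debug prefix."""
    for line in lines:
        m = RECORD.match(line.strip())
        if m:
            yield int(m.group("pid")), float(m.group("ts")), m.group("msg")

def report(path):
    """Print each record with the time elapsed since the previous record."""
    prev = None
    with open(path, encoding="utf-8") as fh:
        for pid, ts, msg in iter_records(fh):
            delta = 0.0 if prev is None else ts - prev
            flag = "  <-- long gap" if delta > 0.25 else ""   # threshold is arbitrary
            print(f"{ts:.5f} +{delta:.5f}s pid={pid} {msg[:80]}{flag}")
            prev = ts

if __name__ == "__main__":
    # "ansible-verbose.log" is a placeholder; pass the real capture as argv[1].
    report(sys.argv[1] if len(sys.argv) > 1 else "ansible-verbose.log")

Fed the two ANSIBALLZ records that appear later in this log (1726882463.12121 "Creating module" and 1726882463.46964 "Writing module into payload"), for example, the delta it prints is the roughly 0.35 s the controller spends building the setup module payload.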
25052 1726882463.06017: running TaskExecutor() for managed_node2/TASK: Gathering Facts 25052 1726882463.06070: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000b9 25052 1726882463.06079: variable 'ansible_search_path' from source: unknown 25052 1726882463.06110: calling self._execute() 25052 1726882463.06157: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882463.06161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882463.06168: variable 'omit' from source: magic vars 25052 1726882463.06236: variable 'omit' from source: magic vars 25052 1726882463.06259: variable 'omit' from source: magic vars 25052 1726882463.06282: variable 'omit' from source: magic vars 25052 1726882463.06318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882463.06345: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882463.06363: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882463.06376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882463.06386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882463.06412: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882463.06415: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882463.06418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882463.06486: Set connection var ansible_pipelining to False 25052 1726882463.06489: Set connection var ansible_connection to ssh 25052 1726882463.06496: Set connection var ansible_shell_type to sh 25052 1726882463.06499: Set connection var ansible_timeout to 10 25052 1726882463.06505: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882463.06510: Set connection var ansible_shell_executable to /bin/sh 25052 1726882463.06525: variable 'ansible_shell_executable' from source: unknown 25052 1726882463.06528: variable 'ansible_connection' from source: unknown 25052 1726882463.06531: variable 'ansible_module_compression' from source: unknown 25052 1726882463.06533: variable 'ansible_shell_type' from source: unknown 25052 1726882463.06535: variable 'ansible_shell_executable' from source: unknown 25052 1726882463.06538: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882463.06540: variable 'ansible_pipelining' from source: unknown 25052 1726882463.06542: variable 'ansible_timeout' from source: unknown 25052 1726882463.06547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882463.06704: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882463.06713: variable 'omit' from source: magic vars 25052 1726882463.06716: starting attempt loop 25052 1726882463.06718: running the handler 25052 1726882463.06731: variable 'ansible_facts' from source: unknown 25052 1726882463.06745: _low_level_execute_command(): starting 25052 1726882463.06752: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882463.07315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.07375: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882463.07389: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.07459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.07527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.09217: stdout chunk (state=3): >>>/root <<< 25052 1726882463.09312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.09343: stderr chunk (state=3): >>><<< 25052 1726882463.09345: stdout chunk (state=3): >>><<< 25052 1726882463.09417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882463.09420: _low_level_execute_command(): starting 25052 1726882463.09424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250 `" && echo ansible-tmp-1726882463.0935752-25076-71300421287250="` echo /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250 `" ) && sleep 0' 25052 1726882463.09769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882463.09772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.09775: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882463.09777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.09828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882463.09831: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.09903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.11783: stdout chunk (state=3): >>>ansible-tmp-1726882463.0935752-25076-71300421287250=/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250 <<< 25052 1726882463.11960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.11965: stdout chunk (state=3): >>><<< 25052 1726882463.11968: stderr chunk (state=3): >>><<< 25052 1726882463.11972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882463.0935752-25076-71300421287250=/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882463.11999: variable 'ansible_module_compression' from source: unknown 25052 1726882463.12106: ANSIBALLZ: Using generic lock for ansible.legacy.setup 25052 1726882463.12116: ANSIBALLZ: Acquiring lock 25052 1726882463.12119: ANSIBALLZ: Lock acquired: 140207139645744 25052 1726882463.12121: ANSIBALLZ: Creating module 25052 1726882463.46964: ANSIBALLZ: Writing module into payload 25052 
1726882463.46968: ANSIBALLZ: Writing module 25052 1726882463.46970: ANSIBALLZ: Renaming module 25052 1726882463.46972: ANSIBALLZ: Done creating module 25052 1726882463.46997: variable 'ansible_facts' from source: unknown 25052 1726882463.47007: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882463.47018: _low_level_execute_command(): starting 25052 1726882463.47025: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 25052 1726882463.47719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882463.47723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882463.47726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882463.47728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882463.47730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882463.47732: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882463.47735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.47737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882463.47739: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882463.47741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882463.47743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882463.47745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882463.47747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882463.47826: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882463.47830: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882463.47832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.47834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882463.47863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882463.47866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.48106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.49808: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 25052 1726882463.50249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.50252: stdout chunk (state=3): >>><<< 25052 1726882463.50259: stderr chunk (state=3): >>><<< 25052 1726882463.50276: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882463.50287 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 25052 1726882463.50336: _low_level_execute_command(): starting 25052 1726882463.50339: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 25052 1726882463.50923: Sending initial data 25052 1726882463.50926: Sent initial data (1181 bytes) 25052 1726882463.51539: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882463.51674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882463.51781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882463.51892: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882463.51913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.52007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.55479: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 
10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 25052 1726882463.55802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.55843: stderr chunk (state=3): >>><<< 25052 1726882463.55912: stdout chunk (state=3): >>><<< 25052 1726882463.55933: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882463.56302: variable 'ansible_facts' from source: unknown 25052 1726882463.56305: variable 'ansible_facts' from source: unknown 25052 1726882463.56308: variable 'ansible_module_compression' from source: unknown 25052 1726882463.56311: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25052 1726882463.56313: variable 'ansible_facts' from source: unknown 25052 1726882463.56785: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py 25052 1726882463.57099: Sending initial data 25052 1726882463.57109: Sent initial data (153 bytes) 25052 1726882463.58311: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.58450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.58552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.60115: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882463.60262: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882463.60330: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpisue8p2m /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py <<< 25052 1726882463.60333: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py" <<< 25052 1726882463.60388: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpisue8p2m" to remote "/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py" <<< 25052 1726882463.63178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.63403: stdout chunk (state=3): >>><<< 25052 1726882463.63406: stderr chunk (state=3): >>><<< 25052 1726882463.63409: done transferring module to remote 25052 1726882463.63411: _low_level_execute_command(): starting 25052 1726882463.63413: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/ /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py && sleep 0' 25052 1726882463.64509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.64652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882463.64823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.64880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.66759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882463.66789: stderr chunk (state=3): >>><<< 25052 1726882463.66796: stdout chunk (state=3): >>><<< 25052 1726882463.66900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882463.66903: _low_level_execute_command(): starting 25052 1726882463.66906: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/AnsiballZ_setup.py && sleep 0' 25052 1726882463.68386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882463.68659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882463.68663: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882463.68665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882463.70690: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 25052 1726882463.70728: stdout chunk (state=3): >>>import _imp # builtin <<< 25052 1726882463.70731: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 25052 1726882463.70935: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 25052 1726882463.70967: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 25052 1726882463.70984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.70990: stdout chunk (state=3): >>>import '_codecs' # <<< 25052 1726882463.71117: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e93104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e92dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 25052 1726882463.71136: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9312a50> <<< 25052 1726882463.71164: stdout chunk (state=3): >>>import '_signal' # <<< 25052 1726882463.71167: stdout chunk (state=3): >>>import '_abc' # <<< 25052 1726882463.71351: stdout chunk (state=3): >>>import 'abc' # <<< 25052 1726882463.71354: stdout chunk (state=3): >>>import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # <<< 25052 1726882463.71530: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 25052 1726882463.71535: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90c1130> <<< 25052 1726882463.71539: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 25052 1726882463.71566: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.71569: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff3e90c1fa0> <<< 25052 1726882463.71591: stdout chunk (state=3): >>>import 'site' # <<< 25052 1726882463.71617: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 25052 1726882463.71996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25052 1726882463.72005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 25052 1726882463.72028: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 25052 1726882463.72054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.72058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 25052 1726882463.72128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 25052 1726882463.72132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 25052 1726882463.72135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 25052 1726882463.72282: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90ffda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 25052 1726882463.72289: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fffb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 25052 1726882463.72334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.72369: stdout chunk (state=3): >>>import 'itertools' # <<< 25052 1726882463.72373: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 25052 1726882463.72411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9137770> <<< 25052 1726882463.72426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 25052 1726882463.72429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25052 1726882463.72432: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9137e00> <<< 25052 1726882463.72434: stdout chunk (state=3): >>>import '_collections' # <<< 25052 1726882463.72544: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9117a40> <<< 25052 
1726882463.72551: stdout chunk (state=3): >>>import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9115160> <<< 25052 1726882463.72614: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fcf50> <<< 25052 1726882463.72686: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 25052 1726882463.72691: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25052 1726882463.72735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 25052 1726882463.72794: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91576b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91562d0> <<< 25052 1726882463.72864: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 25052 1726882463.72870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9116030> <<< 25052 1726882463.72873: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9154b60> <<< 25052 1726882463.72903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918c6b0> <<< 25052 1726882463.72924: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fc1d0> <<< 25052 1726882463.73079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e918cb60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918ca10> <<< 25052 1726882463.73083: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e918cdd0> <<< 25052 1726882463.73121: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90facf0> # 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 25052 1726882463.73124: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918d4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918d190> <<< 25052 1726882463.73144: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 25052 1726882463.73169: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 25052 1726882463.73179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 25052 1726882463.73299: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918e3c0> <<< 25052 1726882463.73303: stdout chunk (state=3): >>>import 'importlib.util' # <<< 25052 1726882463.73578: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91a85c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e91a9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 25052 1726882463.73614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 25052 1726882463.73618: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91aaba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e91ab200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91aa0f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7ff3e91abc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91ab3b0> <<< 25052 1726882463.73753: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918e330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 25052 1726882463.73757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 25052 1726882463.73764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8eafbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 25052 1726882463.73817: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed86e0> <<< 25052 1726882463.73850: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25052 1726882463.73964: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.74048: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed9010> <<< 25052 1726882463.74160: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed99d0> <<< 25052 1726882463.74311: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8eadd90> <<< 25052 1726882463.74315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8edad80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed9880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918eae0> <<< 25052 1726882463.74340: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25052 1726882463.74403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.74423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 25052 1726882463.74447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 25052 1726882463.74480: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f070b0> <<< 25052 1726882463.74532: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 25052 1726882463.74621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.74624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 25052 1726882463.74639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f27440> <<< 25052 1726882463.74657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25052 1726882463.74716: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25052 1726882463.74761: stdout chunk (state=3): >>>import 'ntpath' # <<< 25052 1726882463.74816: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 25052 1726882463.74832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f88200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 25052 1726882463.74851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 25052 1726882463.74938: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 25052 1726882463.74973: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f8a960> <<< 25052 1726882463.75047: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f88320> <<< 25052 1726882463.75084: stdout chunk 
(state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f55220> <<< 25052 1726882463.75154: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8d95370> <<< 25052 1726882463.75157: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f26240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8edbcb0> <<< 25052 1726882463.75388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff3e8f265a0> <<< 25052 1726882463.75625: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_iwkz6vn1/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 25052 1726882463.75721: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.75744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 25052 1726882463.75757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25052 1726882463.75811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25052 1726882463.76021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dfb110> import '_typing' # <<< 25052 1726882463.76085: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dda000> <<< 25052 1726882463.76112: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dd9160> # zipimport: zlib available <<< 25052 1726882463.76138: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 25052 1726882463.76165: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.76186: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 25052 1726882463.76200: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.77613: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.78709: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8df8fe0> <<< 25052 1726882463.78736: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.78771: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 25052 1726882463.78821: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2ab40> <<< 25052 1726882463.78912: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a8d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a1e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 25052 1726882463.78954: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dfbda0> <<< 25052 1726882463.78980: stdout chunk (state=3): >>>import 'atexit' # <<< 25052 1726882463.79197: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2b890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2ba40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2bf50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 25052 1726882463.79200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 25052 1726882463.79229: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8729d90> <<< 25052 1726882463.79260: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.79283: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e872b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 25052 1726882463.79346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' 
import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872c350> <<< 25052 1726882463.79464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 25052 1726882463.79477: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872d4f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 25052 1726882463.79521: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872ff80> <<< 25052 1726882463.79568: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed82f0> <<< 25052 1726882463.79580: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872e240> <<< 25052 1726882463.79607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25052 1726882463.79637: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25052 1726882463.79808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25052 1726882463.79811: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25052 1726882463.79824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8737da0> import '_tokenize' # <<< 25052 1726882463.79891: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8736870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e87365d0> <<< 25052 1726882463.79918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25052 1726882463.79986: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8736b40> <<< 25052 1726882463.80020: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872e750> <<< 25052 1726882463.80085: stdout chunk (state=3): >>># extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e877bfe0> <<< 25052 1726882463.80099: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 25052 1726882463.80102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877c110> <<< 25052 1726882463.80104: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 25052 1726882463.80187: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 25052 1726882463.80190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.80603: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e877dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 25052 1726882463.80606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 25052 1726882463.80643: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e87801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8783980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8780380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784a10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784b90> <<< 25052 1726882463.80675: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877c2c0> <<< 25052 1726882463.80720: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 25052 1726882463.80741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25052 1726882463.80744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 25052 1726882463.80830: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.80836: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.80841: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8610290> <<< 25052 1726882463.80960: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.80966: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8611550> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8786a20> <<< 25052 1726882463.81028: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8787dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8786660> <<< 25052 1726882463.81034: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 25052 1726882463.81121: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.81406: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.81409: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib 
available # zipimport: zlib available <<< 25052 1726882463.81506: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.82025: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.82553: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 25052 1726882463.82578: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 25052 1726882463.82613: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.82650: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.82839: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e86156a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86163c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8787530> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.82932: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 25052 1726882463.83113: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.83155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 25052 1726882463.83178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86164b0> # zipimport: zlib available <<< 25052 1726882463.83712: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.84064: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.84367: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.84518: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 25052 1726882463.84533: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.84597: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 25052 1726882463.84797: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25052 1726882463.85073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 
25052 1726882463.85095: stdout chunk (state=3): >>>import '_ast' # <<< 25052 1726882463.85146: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8617680> <<< 25052 1726882463.85242: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85258: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85358: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 25052 1726882463.85383: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85424: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 25052 1726882463.85569: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85575: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.85638: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25052 1726882463.85681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.85821: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8622180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e861d8b0> import 'ansible.module_utils.common.file' # <<< 25052 1726882463.85843: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 25052 1726882463.85895: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85950: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.85972: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86023: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.86135: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 25052 1726882463.86140: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25052 1726882463.86175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25052 1726882463.86178: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25052 1726882463.86230: stdout chunk (state=3): >>>import 'gettext' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e870a960> <<< 25052 1726882463.86318: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e87fe630> <<< 25052 1726882463.86348: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8621ee0> <<< 25052 1726882463.86373: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8784e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 25052 1726882463.86389: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86426: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 25052 1726882463.86553: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 25052 1726882463.86634: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86657: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.86677: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86713: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86761: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86884: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 25052 1726882463.86909: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.86990: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.87008: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.87107: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 25052 1726882463.87220: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.87391: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.87428: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.87530: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882463.87648: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 25052 1726882463.87651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 25052 1726882463.87667: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b62d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 25052 1726882463.87726: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 25052 1726882463.87738: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e4140> <<< 25052 1726882463.87766: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e43b0> <<< 25052 1726882463.87871: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86a31d0> <<< 25052 1726882463.87895: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b6e40> <<< 25052 1726882463.87898: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b4980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b45f0> <<< 25052 1726882463.87901: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 25052 1726882463.87984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 25052 1726882463.87988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 25052 1726882463.88085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 25052 1726882463.88101: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e7470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e6d50> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e6f00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e6180> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 25052 1726882463.88354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7ff3e82e75c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e834a030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e61b0> <<< 25052 1726882463.88392: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b5a60> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 25052 1726882463.88411: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88526: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # <<< 25052 1726882463.88537: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88585: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 25052 1726882463.88670: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 25052 1726882463.88683: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88705: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88745: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 25052 1726882463.88800: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 25052 1726882463.88876: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.88968: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # <<< 25052 1726882463.89008: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.89068: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.89092: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.89122: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.89309: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 25052 1726882463.89676: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.90108: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 25052 1726882463.90190: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.90297: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 25052 1726882463.90720: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.90789: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 25052 1726882463.90829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e834be90> <<< 25052 1726882463.90836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 25052 1726882463.90853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 25052 1726882463.90985: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e834aba0> <<< 25052 1726882463.90988: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 25052 1726882463.91054: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.91120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 25052 1726882463.91268: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.91301: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 25052 1726882463.91323: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.91381: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.91454: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 25052 1726882463.91500: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.91546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 25052 1726882463.91595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 25052 1726882463.91656: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882463.91717: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8382240> <<< 25052 1726882463.91896: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8372e70> <<< 25052 1726882463.91923: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 25052 1726882463.92039: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 25052 1726882463.92112: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92191: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92352: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92466: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 25052 1726882463.92488: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 25052 1726882463.92697: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92704: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8395ac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8397440> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 25052 1726882463.92728: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 25052 1726882463.92777: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 25052 1726882463.92827: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.92971: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 25052 1726882463.93228: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93329: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93370: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.93474: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93606: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93806: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 25052 1726882463.93817: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.93881: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.94004: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 25052 1726882463.94054: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.94091: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.94708: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.95118: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 25052 1726882463.95140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 25052 1726882463.95418: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 25052 1726882463.95441: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.95547: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 25052 1726882463.95751: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 25052 1726882463.96078: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96184: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96429: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96576: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 25052 1726882463.96601: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96627: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 25052 1726882463.96708: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.96875: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 25052 1726882463.96900: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 25052 1726882463.96932: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 25052 1726882463.97000: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.97062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 25052 1726882463.97075: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.97117: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.97182: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 25052 1726882463.97442: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.97699: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 25052 1726882463.97728: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.97770: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882463.98247: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 25052 1726882463.98297: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98330: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 25052 1726882463.98355: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98373: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98395: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98431: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98486: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98551: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.98863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 25052 1726882463.98933: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99121: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 25052 1726882463.99136: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99181: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 25052 1726882463.99237: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99274: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99323: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 25052 1726882463.99346: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99423: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 25052 1726882463.99527: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99614: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 25052 1726882463.99771: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882463.99942: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 25052 1726882463.99978: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 25052 1726882464.00070: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8192660> <<< 25052 1726882464.00074: 
stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8193410> <<< 25052 1726882464.00084: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e818bfb0> <<< 25052 1726882464.16706: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 25052 1726882464.16710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81d87d0> <<< 25052 1726882464.16742: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81d9820> <<< 25052 1726882464.16765: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882464.16838: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 25052 1726882464.16864: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8226210> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81db680> <<< 25052 1726882464.17077: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 25052 1726882464.38203: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2967, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 564, "free": 2967}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 654, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794611200, "block_size": 4096, "block_total": 65519099, "block_available": 63914700, "block_used": 1604399, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "24", "epoch": "1726882464", "epoch_int": "1726882464", "date": "2024-09-20", "time": "21:34:24", "iso8601_micro": "2024-09-21T01:34:24.370166Z", "iso8601": "2024-09-21T01:34:24Z", "iso8601_basic": "20240920T213424370166", "iso8601_basic_short": "20240920T213424", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": 
{"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.43994140625, "5m": 0.4462890625, "15m": 0.24169921875}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25052 1726882464.38447: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 25052 1726882464.38481: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing 
_stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site <<< 25052 1726882464.38511: stdout chunk (state=3): >>># cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 25052 1726882464.38611: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing <<< 25052 1726882464.38638: stdout chunk (state=3): >>># cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy 
token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy 
ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 25052 1726882464.38688: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version <<< 25052 1726882464.38814: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly <<< 25052 1726882464.38836: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 25052 1726882464.39091: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25052 1726882464.39120: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 25052 1726882464.39155: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 25052 1726882464.39179: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 25052 1726882464.39228: stdout chunk (state=3): >>># destroy ntpath <<< 25052 1726882464.39512: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 25052 1726882464.39531: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 25052 1726882464.39559: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 25052 1726882464.39597: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 25052 1726882464.39622: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 25052 1726882464.39688: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 25052 1726882464.39705: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 25052 1726882464.39733: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 25052 1726882464.39759: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 25052 1726882464.39785: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 25052 1726882464.39825: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] 
wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 25052 1726882464.39851: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 25052 1726882464.39872: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25052 1726882464.40010: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 25052 1726882464.40028: stdout chunk (state=3): >>># destroy _collections <<< 25052 1726882464.40053: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 25052 1726882464.40079: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 25052 1726882464.40114: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 25052 1726882464.40164: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 25052 1726882464.40177: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25052 1726882464.40280: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 25052 1726882464.40315: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 25052 1726882464.40522: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 25052 1726882464.40738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
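The invocation block near the end of the facts payload above shows the module ran with module_args gather_subset ["all"], gather_timeout 10 and an empty filter, which is why every hardware, network and virtual collector was imported and executed on the target. As a minimal sketch (the host pattern, subset choice and filter patterns here are illustrative assumptions, not taken from this run), the same setup module can be asked for a much narrower result:

- name: Gather a narrower fact set than the "all" subset shown in the invocation above
  hosts: all
  gather_facts: false
  tasks:
    - name: Collect only network facts, keep the 10s timeout, and filter the returned keys
      ansible.builtin.setup:
        gather_subset:
          - network
        gather_timeout: 10
        filter:
          - ansible_default_ipv4
          - ansible_interfaces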
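Once the module returns, the keys in the ansible_facts payload above (ansible_distribution, ansible_default_ipv4, ansible_mounts, and so on) become host variables for later tasks. A small sketch of consuming them, again with an assumed host pattern rather than anything taken from this run:

- name: Use facts already gathered for this host
  hosts: all
  tasks:
    - name: Report a few of the facts the setup module returned
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_version }}
          with default IPv4 {{ ansible_default_ipv4.address | default('none') }},
          first mount has {{ (ansible_mounts | first).size_available }} bytes free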
<<< 25052 1726882464.40798: stderr chunk (state=3): >>><<< 25052 1726882464.40801: stdout chunk (state=3): >>><<< 25052 1726882464.41239: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e93104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e92dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90c1fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90ffda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fffb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9137770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9137e00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9117a40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9115160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fcf50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91576b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91562d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9116030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e9154b60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918c6b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90fc1d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e918cb60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918ca10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e918cdd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e90facf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918d4c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918d190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918e3c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91a85c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e91a9d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff3e91aaba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e91ab200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91aa0f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e91abc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e91ab3b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918e330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8eafbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed9010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed99d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8eadd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8edad80> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8ed9880> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e918eae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f070b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f27440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f88200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f8a960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f88320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f55220> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8d95370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8f26240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8edbcb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7ff3e8f265a0> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_iwkz6vn1/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dfb110> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dda000> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dd9160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8df8fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2ab40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a8d0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a1e0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2a930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8dfbda0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2b890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8e2ba40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8e2bf50> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8729d90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e872b9b0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872c350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872d4f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872ff80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8ed82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872e240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8737da0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8736870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff3e87365d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8736b40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e872e750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e877bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e877dbe0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877d9a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e87801d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877e2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8783980> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8780380> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784a10> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784b90> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8784b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e877c2c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8610290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8611550> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8786a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8787dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8786660> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e86156a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86163c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8787530> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86164b0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8617680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8622180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e861d8b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e870a960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e87fe630> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8621ee0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8784e60> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b62d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e4140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e43b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86a31d0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b6e40> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b4980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b45f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e7470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e6d50> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e82e6f00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e6180> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e75c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e834a030> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e82e61b0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e86b5a60> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e834be90> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e834aba0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8382240> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8372e70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8395ac0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8397440> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff3e8192660> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8193410> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e818bfb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81d87d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81d9820> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e8226210> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff3e81db680> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2967, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 564, "free": 2967}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": 
"4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 654, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794611200, "block_size": 4096, "block_total": 65519099, "block_available": 63914700, "block_used": 1604399, "inode_total": 131070960, "inode_available": 131029052, "inode_used": 41908, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "24", "epoch": "1726882464", "epoch_int": "1726882464", "date": "2024-09-20", "time": "21:34:24", "iso8601_micro": "2024-09-21T01:34:24.370166Z", "iso8601": "2024-09-21T01:34:24Z", "iso8601_basic": "20240920T213424370166", "iso8601_basic_short": "20240920T213424", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.43994140625, "5m": 0.4462890625, "15m": 0.24169921875}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_pkg_mgr": "dnf", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] 
removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob 
# cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # 
destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy 
ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy 
_lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types 
# cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
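The interpreter-discovery warning above can be avoided by pinning the interpreter for the host instead of letting Ansible discover it on each run. A minimal inventory sketch, assuming managed_node2 is the host reached at 10.31.14.69 in this run and that /usr/bin/python3.12 is the interpreter that should stay in use (both values taken from the log; adjust for the real inventory):

    all:
      hosts:
        managed_node2:
          ansible_host: 10.31.14.69                        # address seen in the SSH debug output above
          ansible_python_interpreter: /usr/bin/python3.12  # pin the interpreter so discovery is skipped

With ansible_python_interpreter set explicitly, interpreter discovery is not performed for that host, so installing another Python later does not change which interpreter the modules run under and the warning is no longer emitted.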
25052 1726882464.44699: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882464.44702: _low_level_execute_command(): starting 25052 1726882464.44705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882463.0935752-25076-71300421287250/ > /dev/null 2>&1 && sleep 0' 25052 1726882464.45910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882464.45956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882464.45992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.46052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882464.47907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882464.47920: stdout chunk (state=3): >>><<< 25052 1726882464.47933: stderr chunk (state=3): >>><<< 25052 1726882464.48018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882464.48034: handler run complete 25052 1726882464.48198: variable 'ansible_facts' from source: unknown 25052 1726882464.48391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.49110: variable 'ansible_facts' from source: unknown 25052 1726882464.49197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.49523: attempt loop complete, returning result 25052 1726882464.49532: _execute() done 25052 1726882464.49539: dumping result to json 25052 1726882464.49573: done dumping result, returning 25052 1726882464.49614: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-f7f6-4a6d-0000000000b9] 25052 1726882464.49799: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b9 25052 1726882464.50724: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b9 25052 1726882464.50727: WORKER PROCESS EXITING ok: [managed_node2] 25052 1726882464.51289: no more pending results, returning what we have 25052 1726882464.51292: results queue empty 25052 1726882464.51294: checking for any_errors_fatal 25052 1726882464.51296: done checking for any_errors_fatal 25052 1726882464.51297: checking for max_fail_percentage 25052 1726882464.51298: done checking for max_fail_percentage 25052 1726882464.51299: checking to see if all hosts have failed and the running result is not ok 25052 1726882464.51300: done checking to see if all hosts have failed 25052 1726882464.51301: getting the remaining hosts for this loop 25052 1726882464.51302: done getting the remaining hosts for this loop 25052 1726882464.51305: getting the next task for host managed_node2 25052 1726882464.51311: done getting next task for host managed_node2 25052 1726882464.51313: ^ task is: TASK: meta (flush_handlers) 25052 1726882464.51314: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882464.51318: getting variables 25052 1726882464.51320: in VariableManager get_vars() 25052 1726882464.51340: Calling all_inventory to load vars for managed_node2 25052 1726882464.51343: Calling groups_inventory to load vars for managed_node2 25052 1726882464.51347: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882464.51356: Calling all_plugins_play to load vars for managed_node2 25052 1726882464.51359: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882464.51363: Calling groups_plugins_play to load vars for managed_node2 25052 1726882464.51735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.52128: done with get_vars() 25052 1726882464.52139: done getting variables 25052 1726882464.52406: in VariableManager get_vars() 25052 1726882464.52417: Calling all_inventory to load vars for managed_node2 25052 1726882464.52420: Calling groups_inventory to load vars for managed_node2 25052 1726882464.52423: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882464.52428: Calling all_plugins_play to load vars for managed_node2 25052 1726882464.52430: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882464.52432: Calling groups_plugins_play to load vars for managed_node2 25052 1726882464.52569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.52941: done with get_vars() 25052 1726882464.52955: done queuing things up, now waiting for results queue to drain 25052 1726882464.52957: results queue empty 25052 1726882464.52958: checking for any_errors_fatal 25052 1726882464.52961: done checking for any_errors_fatal 25052 1726882464.52962: checking for max_fail_percentage 25052 1726882464.52963: done checking for max_fail_percentage 25052 1726882464.52964: checking to see if all hosts have failed and the running result is not ok 25052 1726882464.52964: done checking to see if all hosts have failed 25052 1726882464.52970: getting the remaining hosts for this loop 25052 1726882464.52971: done getting the remaining hosts for this loop 25052 1726882464.52974: getting the next task for host managed_node2 25052 1726882464.52979: done getting next task for host managed_node2 25052 1726882464.52981: ^ task is: TASK: Include the task 'el_repo_setup.yml' 25052 1726882464.52983: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882464.52985: getting variables 25052 1726882464.52986: in VariableManager get_vars() 25052 1726882464.53198: Calling all_inventory to load vars for managed_node2 25052 1726882464.53201: Calling groups_inventory to load vars for managed_node2 25052 1726882464.53203: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882464.53208: Calling all_plugins_play to load vars for managed_node2 25052 1726882464.53210: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882464.53213: Calling groups_plugins_play to load vars for managed_node2 25052 1726882464.53345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.53728: done with get_vars() 25052 1726882464.53736: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:11 Friday 20 September 2024 21:34:24 -0400 (0:00:01.484) 0:00:01.495 ****** 25052 1726882464.54020: entering _queue_task() for managed_node2/include_tasks 25052 1726882464.54022: Creating lock for include_tasks 25052 1726882464.54551: worker is 1 (out of 1 available) 25052 1726882464.54565: exiting _queue_task() for managed_node2/include_tasks 25052 1726882464.54576: done queuing things up, now waiting for results queue to drain 25052 1726882464.54578: waiting for pending results... 25052 1726882464.55072: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 25052 1726882464.55211: in run() - task 12673a56-9f93-f7f6-4a6d-000000000006 25052 1726882464.55334: variable 'ansible_search_path' from source: unknown 25052 1726882464.55339: calling self._execute() 25052 1726882464.55503: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882464.55515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882464.55529: variable 'omit' from source: magic vars 25052 1726882464.55844: _execute() done 25052 1726882464.55899: dumping result to json 25052 1726882464.55902: done dumping result, returning 25052 1726882464.55904: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-f7f6-4a6d-000000000006] 25052 1726882464.55928: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000006 25052 1726882464.56166: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000006 25052 1726882464.56179: WORKER PROCESS EXITING 25052 1726882464.56239: no more pending results, returning what we have 25052 1726882464.56244: in VariableManager get_vars() 25052 1726882464.56274: Calling all_inventory to load vars for managed_node2 25052 1726882464.56277: Calling groups_inventory to load vars for managed_node2 25052 1726882464.56280: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882464.56292: Calling all_plugins_play to load vars for managed_node2 25052 1726882464.56297: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882464.56299: Calling groups_plugins_play to load vars for managed_node2 25052 1726882464.56676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.56872: done with get_vars() 25052 1726882464.56879: variable 'ansible_search_path' from source: unknown 25052 1726882464.56895: we have included files to process 25052 1726882464.56896: generating 
all_blocks data 25052 1726882464.56897: done generating all_blocks data 25052 1726882464.56898: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25052 1726882464.56899: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25052 1726882464.56902: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 25052 1726882464.57515: in VariableManager get_vars() 25052 1726882464.57529: done with get_vars() 25052 1726882464.57540: done processing included file 25052 1726882464.57542: iterating over new_blocks loaded from include file 25052 1726882464.57559: in VariableManager get_vars() 25052 1726882464.57570: done with get_vars() 25052 1726882464.57571: filtering new block on tags 25052 1726882464.57584: done filtering new block on tags 25052 1726882464.57587: in VariableManager get_vars() 25052 1726882464.57597: done with get_vars() 25052 1726882464.57599: filtering new block on tags 25052 1726882464.57613: done filtering new block on tags 25052 1726882464.57616: in VariableManager get_vars() 25052 1726882464.57625: done with get_vars() 25052 1726882464.57627: filtering new block on tags 25052 1726882464.57638: done filtering new block on tags 25052 1726882464.57640: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 25052 1726882464.57645: extending task lists for all hosts with included blocks 25052 1726882464.57688: done extending task lists 25052 1726882464.57689: done processing included files 25052 1726882464.57690: results queue empty 25052 1726882464.57691: checking for any_errors_fatal 25052 1726882464.57692: done checking for any_errors_fatal 25052 1726882464.57695: checking for max_fail_percentage 25052 1726882464.57696: done checking for max_fail_percentage 25052 1726882464.57696: checking to see if all hosts have failed and the running result is not ok 25052 1726882464.57697: done checking to see if all hosts have failed 25052 1726882464.57698: getting the remaining hosts for this loop 25052 1726882464.57699: done getting the remaining hosts for this loop 25052 1726882464.57701: getting the next task for host managed_node2 25052 1726882464.57705: done getting next task for host managed_node2 25052 1726882464.57707: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 25052 1726882464.57709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882464.57711: getting variables 25052 1726882464.57712: in VariableManager get_vars() 25052 1726882464.57720: Calling all_inventory to load vars for managed_node2 25052 1726882464.57722: Calling groups_inventory to load vars for managed_node2 25052 1726882464.57724: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882464.57729: Calling all_plugins_play to load vars for managed_node2 25052 1726882464.57731: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882464.57733: Calling groups_plugins_play to load vars for managed_node2 25052 1726882464.57906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882464.58135: done with get_vars() 25052 1726882464.58143: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:34:24 -0400 (0:00:00.042) 0:00:01.537 ****** 25052 1726882464.58222: entering _queue_task() for managed_node2/setup 25052 1726882464.58500: worker is 1 (out of 1 available) 25052 1726882464.58512: exiting _queue_task() for managed_node2/setup 25052 1726882464.58523: done queuing things up, now waiting for results queue to drain 25052 1726882464.58524: waiting for pending results... 25052 1726882464.58914: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 25052 1726882464.58919: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000ca 25052 1726882464.58922: variable 'ansible_search_path' from source: unknown 25052 1726882464.58924: variable 'ansible_search_path' from source: unknown 25052 1726882464.58926: calling self._execute() 25052 1726882464.58971: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882464.58983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882464.58999: variable 'omit' from source: magic vars 25052 1726882464.59519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882464.62950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882464.63147: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882464.63220: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882464.63328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882464.63426: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882464.63621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882464.63661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882464.63690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25052 1726882464.63760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882464.63845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882464.64260: variable 'ansible_facts' from source: unknown 25052 1726882464.64311: variable 'network_test_required_facts' from source: task vars 25052 1726882464.64478: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 25052 1726882464.64585: variable 'omit' from source: magic vars 25052 1726882464.64590: variable 'omit' from source: magic vars 25052 1726882464.64594: variable 'omit' from source: magic vars 25052 1726882464.64596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882464.64801: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882464.64805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882464.64807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882464.64811: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882464.64931: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882464.64935: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882464.64937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882464.65260: Set connection var ansible_pipelining to False 25052 1726882464.65263: Set connection var ansible_connection to ssh 25052 1726882464.65265: Set connection var ansible_shell_type to sh 25052 1726882464.65267: Set connection var ansible_timeout to 10 25052 1726882464.65277: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882464.65285: Set connection var ansible_shell_executable to /bin/sh 25052 1726882464.65397: variable 'ansible_shell_executable' from source: unknown 25052 1726882464.65478: variable 'ansible_connection' from source: unknown 25052 1726882464.65481: variable 'ansible_module_compression' from source: unknown 25052 1726882464.65483: variable 'ansible_shell_type' from source: unknown 25052 1726882464.65485: variable 'ansible_shell_executable' from source: unknown 25052 1726882464.65487: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882464.65490: variable 'ansible_pipelining' from source: unknown 25052 1726882464.65492: variable 'ansible_timeout' from source: unknown 25052 1726882464.65497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882464.65682: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882464.65762: variable 'omit' from source: magic vars 25052 1726882464.65772: starting attempt loop 25052 
1726882464.65798: running the handler 25052 1726882464.66209: _low_level_execute_command(): starting 25052 1726882464.66212: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882464.67697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882464.67782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882464.67802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882464.67826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882464.67916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882464.68145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.68225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882464.70685: stdout chunk (state=3): >>>/root <<< 25052 1726882464.71267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882464.71270: stdout chunk (state=3): >>><<< 25052 1726882464.71272: stderr chunk (state=3): >>><<< 25052 1726882464.71275: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882464.71284: _low_level_execute_command(): starting 25052 1726882464.71286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602 `" && echo 
ansible-tmp-1726882464.712044-25130-185683903057602="` echo /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602 `" ) && sleep 0' 25052 1726882464.72602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882464.72648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882464.72652: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.72728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882464.75223: stdout chunk (state=3): >>>ansible-tmp-1726882464.712044-25130-185683903057602=/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602 <<< 25052 1726882464.75364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882464.75398: stderr chunk (state=3): >>><<< 25052 1726882464.75439: stdout chunk (state=3): >>><<< 25052 1726882464.75462: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882464.712044-25130-185683903057602=/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882464.75609: variable 'ansible_module_compression' from source: unknown 25052 1726882464.75721: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25052 1726882464.75944: variable 'ansible_facts' from source: unknown 
25052 1726882464.76419: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py 25052 1726882464.76758: Sending initial data 25052 1726882464.76859: Sent initial data (153 bytes) 25052 1726882464.77912: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882464.78108: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882464.78126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882464.78149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.78242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882464.80495: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882464.80539: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882464.80625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpo1tnvdpk /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py <<< 25052 1726882464.80634: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py" <<< 25052 1726882464.80728: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpo1tnvdpk" to remote "/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py" <<< 25052 1726882464.83614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882464.83630: stdout chunk (state=3): >>><<< 25052 1726882464.83647: stderr chunk (state=3): >>><<< 25052 1726882464.83855: done transferring module to remote 25052 1726882464.83858: _low_level_execute_command(): starting 25052 1726882464.83861: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/ /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py && sleep 0' 25052 1726882464.85014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882464.85029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882464.85045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882464.85174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882464.85283: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882464.85325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.85413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882464.87986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882464.88005: stdout chunk (state=3): >>><<< 25052 1726882464.88046: stderr chunk (state=3): >>><<< 25052 1726882464.88296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882464.88300: _low_level_execute_command(): starting 25052 1726882464.88303: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/AnsiballZ_setup.py && sleep 0' 25052 1726882464.89420: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882464.89434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882464.89447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882464.89456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882464.89506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882464.89665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882464.89690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882464.89778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882464.93028: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 25052 1726882464.93085: stdout chunk (state=3): >>>import 'posix' # <<< 25052 1726882464.93150: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook<<< 25052 1726882464.93177: stdout chunk (state=3): >>> import 'time' # <<< 25052 1726882464.93296: stdout chunk (state=3): >>> <<< 25052 1726882464.93300: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook<<< 25052 1726882464.93326: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882464.93349: stdout chunk (state=3): >>>import '_codecs' # <<< 25052 1726882464.93408: stdout chunk (state=3): >>>import 'codecs' # <<< 25052 1726882464.93505: stdout chunk (state=3): >>> <<< 25052 1726882464.93521: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab54184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab53e7b30> <<< 25052 1726882464.93546: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 25052 1726882464.93809: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab541aa50><<< 25052 1726882464.93827: stdout chunk (state=3): >>> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 25052 1726882464.93895: stdout chunk (state=3): >>>import '_collections_abc' # <<< 25052 1726882464.93927: stdout chunk (state=3): >>>import 'genericpath' # <<< 25052 1726882464.94075: stdout chunk (state=3): >>> import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 25052 1726882464.94099: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 25052 1726882464.94147: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 25052 1726882464.94179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 25052 1726882464.94200: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab51c9130><<< 25052 1726882464.94296: stdout chunk (state=3): >>> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 25052 1726882464.94325: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab51c9fa0> <<< 25052 1726882464.94383: stdout chunk (state=3): >>>import 'site' # <<< 25052 1726882464.94415: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 25052 1726882464.94521: stdout chunk (state=3): >>> <<< 25052 1726882464.94524: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. 
<<< 25052 1726882464.95087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25052 1726882464.95111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 25052 1726882464.95162: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 25052 1726882464.95278: stdout chunk (state=3): >>> <<< 25052 1726882464.95286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 25052 1726882464.95310: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 25052 1726882464.95340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 25052 1726882464.95489: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5207e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5207f20> <<< 25052 1726882464.95547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 25052 1726882464.95586: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 25052 1726882464.95605: stdout chunk (state=3): >>> <<< 25052 1726882464.95660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882464.95691: stdout chunk (state=3): >>>import 'itertools' # <<< 25052 1726882464.95728: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 25052 1726882464.95934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab523f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25052 1726882464.95937: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab523ff20> import '_collections' # <<< 25052 1726882464.95940: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521fb30> import '_functools' # <<< 25052 1726882464.96028: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521d250> <<< 25052 1726882464.96122: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5205010><<< 25052 1726882464.96215: stdout chunk (state=3): >>> <<< 25052 1726882464.96235: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 25052 1726882464.96256: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25052 1726882464.96282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 25052 1726882464.96306: stdout chunk (state=3): >>> <<< 25052 1726882464.96342: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 25052 1726882464.96425: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525f800> <<< 25052 1726882464.96430: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525e450> <<< 25052 1726882464.96506: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 25052 1726882464.96517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525ccb0><<< 25052 1726882464.96572: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 25052 1726882464.96585: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 25052 1726882464.96706: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5294860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5204290><<< 25052 1726882464.96710: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.96724: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5294bc0><<< 25052 1726882464.96797: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.96911: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5202db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882464.96914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 25052 1726882464.97008: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 25052 1726882464.97011: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52956a0> <<< 25052 1726882464.97013: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5295370> <<< 25052 1726882464.97146: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52965a0> import 'importlib.util' # <<< 25052 1726882464.97149: stdout chunk (state=3): >>> import 'runpy' # <<< 25052 1726882464.97188: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 25052 1726882464.97245: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 25052 1726882464.97351: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 25052 1726882464.97354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52ac7a0> import 'errno' # <<< 25052 1726882464.97380: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.97411: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 25052 1726882464.97538: stdout chunk (state=3): >>> <<< 25052 1726882464.97591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.97596: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52ae270><<< 25052 1726882464.97641: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 25052 1726882464.97669: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 25052 1726882464.97766: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 25052 1726882464.97769: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.97782: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52afda0><<< 25052 1726882464.97879: stdout chunk (state=3): >>> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5296510> <<< 25052 1726882464.97910: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 25052 1726882464.98001: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.98121: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fa3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcc740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcc4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcc680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25052 1726882464.98211: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.98422: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fccfe0> <<< 25052 1726882464.98563: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882464.98590: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcd910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcc8c0> <<< 25052 1726882464.98618: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fa1d90> <<< 25052 1726882464.98648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 25052 1726882464.98714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 25052 1726882464.98753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 25052 1726882464.98765: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fced20> <<< 25052 1726882464.98818: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcda60> <<< 25052 1726882464.98831: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5296750> <<< 25052 1726882464.99031: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4ff7080> <<< 25052 1726882464.99110: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 25052 1726882464.99184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882464.99208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25052 1726882464.99256: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab501b440> <<< 25052 1726882464.99301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25052 1726882464.99358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 25052 1726882464.99438: stdout chunk (state=3): >>>import 'ntpath' # <<< 25052 1726882464.99466: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507c230> <<< 25052 1726882464.99510: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 25052 1726882464.99541: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 25052 1726882464.99632: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 25052 1726882464.99644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 25052 1726882464.99772: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507e990> <<< 25052 1726882464.99874: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507c350> <<< 25052 1726882464.99984: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5049250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 25052 1726882464.99987: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4929310> <<< 25052 1726882465.00238: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab501a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcfc50> <<< 25052 1726882465.00391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7ab49295b0> <<< 25052 1726882465.00734: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_xo_7xx4d/ansible_setup_payload.zip' <<< 25052 1726882465.00737: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.00937: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.00975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 25052 1726882465.00991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25052 1726882465.01053: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25052 1726882465.01161: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 25052 1726882465.01230: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 25052 1726882465.01234: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4992f90> <<< 25052 1726882465.01308: stdout chunk (state=3): >>>import '_typing' # <<< 25052 1726882465.01516: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4971e80> <<< 25052 1726882465.01532: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49710a0> <<< 25052 1726882465.01588: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.01611: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 25052 1726882465.01828: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 25052 1726882465.01832: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 25052 1726882465.03349: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.04464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4991280> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 25052 1726882465.04513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 25052 1726882465.04540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c2990> <<< 25052 1726882465.04611: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2030> <<< 25052 1726882465.04684: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 25052 1726882465.04782: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4993c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c3770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c3980> <<< 25052 1726882465.04787: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 25052 1726882465.04852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 25052 1726882465.05004: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 25052 
1726882465.05019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab482dca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab482f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 25052 1726882465.05278: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4830290> <<< 25052 1726882465.05328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4831400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4833e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5202ea0> <<< 25052 1726882465.05475: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4832180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25052 1726882465.05488: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 25052 1726882465.05630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483be60> <<< 25052 1726882465.05651: stdout chunk (state=3): >>>import '_tokenize' # <<< 25052 1726882465.05855: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483a690> <<< 25052 1726882465.05878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches 
/usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25052 1726882465.05901: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483ac00> <<< 25052 1726882465.05956: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4832660> <<< 25052 1726882465.05997: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.06011: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab487ff80> <<< 25052 1726882465.06229: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4880260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4881d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4881ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 25052 1726882465.06254: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.06309: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4884290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab48823c0> <<< 25052 1726882465.06418: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 25052 1726882465.06424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 25052 1726882465.06445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 25052 1726882465.06582: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4887a70> <<< 25052 1726882465.06802: stdout chunk (state=3): >>>import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4884440> <<< 25052 1726882465.06881: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.06899: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab48887d0> <<< 25052 1726882465.06945: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4888c20> <<< 25052 1726882465.07037: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.07067: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4888ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4880470> <<< 25052 1726882465.07170: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25052 1726882465.07173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 25052 1726882465.07322: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4710380> <<< 25052 1726882465.07510: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 25052 1726882465.07532: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 25052 1726882465.07547: stdout chunk (state=3): >>> import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab47115e0> <<< 25052 1726882465.07610: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab488ab40> <<< 25052 1726882465.07614: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.07648: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab488bec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab488a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 25052 1726882465.07681: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.07760: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.07862: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.07909: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 25052 1726882465.07953: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 25052 1726882465.08072: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.08259: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.08996: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.09443: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4719730> <<< 25052 1726882465.09519: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 25052 1726882465.09548: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471a4b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4711460> <<< 25052 1726882465.09609: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 25052 1726882465.09647: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 25052 1726882465.09678: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.09900: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.10126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471a480> <<< 25052 1726882465.10186: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.10855: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.11563: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.11667: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.11782: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 25052 
1726882465.12099: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 25052 1726882465.12102: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.12238: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 25052 1726882465.12402: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.12699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25052 1726882465.12702: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25052 1726882465.12704: stdout chunk (state=3): >>>import '_ast' # <<< 25052 1726882465.12763: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471b680> <<< 25052 1726882465.12920: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 25052 1726882465.12940: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 25052 1726882465.12999: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.13030: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 25052 1726882465.13038: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.13231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.13244: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25052 1726882465.13299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.13361: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.13398: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4726030> <<< 25052 1726882465.13404: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4721070> <<< 25052 1726882465.13463: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 25052 1726882465.14003: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.14008: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14010: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14013: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 
25052 1726882465.14015: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.14017: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25052 1726882465.14020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 25052 1726882465.14022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25052 1726882465.14024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25052 1726882465.14042: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab480ea20> <<< 25052 1726882465.14073: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49ee6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4726150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471b140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 25052 1726882465.14103: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.14129: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 25052 1726882465.14203: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14254: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14278: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.14314: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14388: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.14435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 25052 1726882465.14639: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 25052 1726882465.14817: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.14982: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.15028: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.15073: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 25052 1726882465.15100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 25052 1726882465.15131: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 25052 1726882465.15168: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 25052 1726882465.15190: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b6360> <<< 25052 1726882465.15211: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 25052 1726882465.15262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 25052 1726882465.15319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 25052 1726882465.15322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d01a0> <<< 25052 1726882465.15418: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.15423: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d0590> <<< 25052 1726882465.15425: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47a3350> <<< 25052 1726882465.15474: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b6ea0> <<< 25052 1726882465.15563: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4a10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 25052 1726882465.15588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 25052 1726882465.15632: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d3470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d2d20> <<< 25052 
1726882465.15654: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d2f00> <<< 25052 1726882465.15708: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d2150> <<< 25052 1726882465.15719: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 25052 1726882465.15860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 25052 1726882465.15864: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d34d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 25052 1726882465.15891: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab442e000> <<< 25052 1726882465.16007: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d3fe0> <<< 25052 1726882465.16020: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4740> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 25052 1726882465.16054: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 25052 1726882465.16168: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16267: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 25052 1726882465.16400: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.16404: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 25052 1726882465.16406: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16430: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 25052 1726882465.16450: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16480: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 25052 1726882465.16566: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16588: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 25052 1726882465.16674: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16720: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.16816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 25052 1726882465.17237: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.17666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 25052 1726882465.17818: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.17887: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 25052 1726882465.17962: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # <<< 25052 1726882465.18069: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.18072: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # <<< 25052 1726882465.18499: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 25052 1726882465.18523: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442fe60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 25052 1726882465.18550: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442ec90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 25052 1726882465.18611: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.18677: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 25052 1726882465.18850: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.18975: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 25052 1726882465.19016: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 25052 1726882465.19053: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.19100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 25052 1726882465.19137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 25052 1726882465.19381: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.19442: stdout chunk (state=3): >>># extension module '_ssl' 
executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab446e360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab445f140> <<< 25052 1726882465.19465: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 25052 1726882465.19514: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.19608: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 25052 1726882465.19714: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.19909: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.19990: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 25052 1726882465.20014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 25052 1726882465.20049: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.20174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 25052 1726882465.20314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4481e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442e990> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.20352: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 25052 1726882465.20373: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.20621: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.20658: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 25052 1726882465.20671: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.20960: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 25052 1726882465.20981: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21005: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21019: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21156: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 25052 1726882465.21427: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21550: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 25052 1726882465.21596: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.21626: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.22300: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.22676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 25052 1726882465.22701: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 25052 1726882465.22804: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.22905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 25052 1726882465.23050: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 25052 1726882465.23168: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23289: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 25052 1726882465.23453: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 25052 1726882465.23503: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23558: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 25052 1726882465.23641: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23738: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.23936: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 25052 1726882465.24159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 25052 1726882465.24179: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24198: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 25052 1726882465.24269: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24297: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 25052 1726882465.24436: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 25052 1726882465.24597: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 25052 1726882465.24634: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 25052 1726882465.24665: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 25052 1726882465.24749: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.24983: stdout chunk (state=3): >>># zipimport: zlib available <<< 
25052 1726882465.25260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 25052 1726882465.25299: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25365: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 25052 1726882465.25589: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 25052 1726882465.25605: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25644: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 25052 1726882465.25646: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25752: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25849: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 25052 1726882465.25950: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 25052 1726882465.25953: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.25981: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.26220: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.26235: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 25052 1726882465.26238: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 25052 1726882465.26320: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.26389: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 25052 1726882465.26613: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.26742: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 25052 1726882465.26801: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.26878: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 25052 1726882465.26901: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.26951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 25052 1726882465.26966: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.27045: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.27125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 25052 1726882465.27146: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.27218: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.27313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 
'ansible.module_utils.facts' # <<< 25052 1726882465.27409: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.28034: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 25052 1726882465.28170: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 25052 1726882465.28174: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab427e900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab427d1c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4275550> <<< 25052 1726882465.28819: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "25", "epoch": "1726882465", "epoch_int": "1726882465", "date": "2024-09-20", "time": "21:34:25", "iso8601_micro": "2024-09-21T01:34:25.278660Z", "iso8601": "2024-09-21T01:34:25Z", "iso8601_basic": "20240920T213425278660", "iso8601_basic_short": "20240920T213425", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": <<< 25052 1726882465.28852: stdout chunk (state=3): >>>"Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25052 1726882465.29383: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 25052 1726882465.29413: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # 
cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 25052 1726882465.29434: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 25052 1726882465.29460: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external <<< 25052 1726882465.29504: stdout chunk (state=3): >>># cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 25052 1726882465.29532: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # 
destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 25052 1726882465.29568: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 <<< 25052 1726882465.29588: stdout chunk (state=3): >>># cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing 
swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext <<< 25052 1726882465.29626: stdout chunk (state=3): >>># cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time<<< 25052 1726882465.29664: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix <<< 25052 1726882465.29681: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution <<< 25052 1726882465.29720: stdout chunk (state=3): >>># destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd <<< 25052 1726882465.29730: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 25052 1726882465.30052: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 25052 1726882465.30101: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 25052 1726882465.30115: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 25052 1726882465.30191: stdout chunk (state=3): >>># destroy ntpath <<< 25052 
1726882465.30198: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 25052 1726882465.30223: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 25052 1726882465.30296: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 25052 1726882465.30299: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 25052 1726882465.30345: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array <<< 25052 1726882465.30399: stdout chunk (state=3): >>># destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 25052 1726882465.30430: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 25052 1726882465.30501: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 25052 1726882465.30505: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 25052 1726882465.30522: stdout chunk (state=3): >>># destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 25052 1726882465.30591: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 25052 1726882465.30649: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 25052 1726882465.30684: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 25052 1726882465.30696: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections <<< 25052 1726882465.30745: stdout chunk (state=3): >>># cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 25052 1726882465.30767: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25052 1726882465.30900: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 25052 1726882465.30938: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 25052 1726882465.30998: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 25052 1726882465.31081: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25052 1726882465.31242: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 25052 1726882465.31413: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 25052 1726882465.31622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882465.31663: stderr chunk (state=3): >>><<< 25052 1726882465.31671: stdout chunk (state=3): >>><<< 25052 1726882465.32003: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab54184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab53e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab541aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab51c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab51c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5207e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5207f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab523f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab523ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5205010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab521e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab525ccb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5294860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5204290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5294bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5202db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5295370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52965a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7ab52aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52ae270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab52afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab52af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5296510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fa3bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcc740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcc4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcc680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fccfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4fcd910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcc8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fa1d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fced20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcda60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5296750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4ff7080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab501b440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab507c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab5049250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4929310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab501a240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4fcfc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f7ab49295b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_xo_7xx4d/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4992f90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4971e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49710a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4991280> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c2480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4993c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c3770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab49c3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49c3e90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab482dca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab482f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4830290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4831400> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4833e90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab5202ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4832180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483be60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483a930> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483a690> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab483ac00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4832660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab487ff80> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4880260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4881d00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4881ac0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4884290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab48823c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4887a70> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4884440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab48887d0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4888c20> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4888ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4880470> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4710380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab47115e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab488ab40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab488bec0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab488a780> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4719730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471a4b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4711460> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471a480> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471b680> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4726030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4721070> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab480ea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab49ee6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4726150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab471b140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b6360> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d01a0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d0590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47a3350> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b6ea0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4a10> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d3470> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d2d20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab43d2f00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d34d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab442e000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab43d3fe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab47b4740> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442fe60> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442ec90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab446e360> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab445f140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab4481e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab442e990> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7ab427e900> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab427d1c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7ab4275550> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "25", "epoch": "1726882465", "epoch_int": "1726882465", "date": "2024-09-20", "time": "21:34:25", "iso8601_micro": "2024-09-21T01:34:25.278660Z", "iso8601": "2024-09-21T01:34:25Z", "iso8601_basic": "20240920T213425278660", "iso8601_basic_short": "20240920T213425", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # 
cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] 
removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] 
wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath 
# cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing 
ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy 
pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # 
cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 25052 1726882465.33595: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882465.33599: _low_level_execute_command(): starting 25052 1726882465.33601: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882464.712044-25130-185683903057602/ > /dev/null 2>&1 && sleep 0' 25052 1726882465.33844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882465.33986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.34083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882465.34089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882465.34096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.34187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.36596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.36617: stderr chunk (state=3): >>><<< 25052 1726882465.36621: stdout chunk (state=3): >>><<< 25052 1726882465.36636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882465.36643: handler run complete 25052 1726882465.36673: variable 'ansible_facts' from source: unknown 25052 1726882465.36726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882465.36797: variable 'ansible_facts' from source: unknown 25052 1726882465.36830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882465.36864: attempt loop complete, returning result 25052 1726882465.36867: _execute() done 25052 1726882465.36869: dumping result to json 25052 1726882465.36878: done dumping result, returning 25052 1726882465.36885: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-f7f6-4a6d-0000000000ca] 25052 1726882465.36890: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ca 25052 1726882465.37072: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ca 25052 1726882465.37075: WORKER PROCESS EXITING ok: [managed_node2] 25052 1726882465.37174: no more pending results, returning what we 
have 25052 1726882465.37177: results queue empty 25052 1726882465.37177: checking for any_errors_fatal 25052 1726882465.37178: done checking for any_errors_fatal 25052 1726882465.37179: checking for max_fail_percentage 25052 1726882465.37181: done checking for max_fail_percentage 25052 1726882465.37181: checking to see if all hosts have failed and the running result is not ok 25052 1726882465.37182: done checking to see if all hosts have failed 25052 1726882465.37183: getting the remaining hosts for this loop 25052 1726882465.37184: done getting the remaining hosts for this loop 25052 1726882465.37189: getting the next task for host managed_node2 25052 1726882465.37223: done getting next task for host managed_node2 25052 1726882465.37226: ^ task is: TASK: Check if system is ostree 25052 1726882465.37229: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882465.37232: getting variables 25052 1726882465.37233: in VariableManager get_vars() 25052 1726882465.37253: Calling all_inventory to load vars for managed_node2 25052 1726882465.37255: Calling groups_inventory to load vars for managed_node2 25052 1726882465.37258: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882465.37300: Calling all_plugins_play to load vars for managed_node2 25052 1726882465.37302: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882465.37304: Calling groups_plugins_play to load vars for managed_node2 25052 1726882465.37440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882465.37633: done with get_vars() 25052 1726882465.37644: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:34:25 -0400 (0:00:00.795) 0:00:02.332 ****** 25052 1726882465.37743: entering _queue_task() for managed_node2/stat 25052 1726882465.38013: worker is 1 (out of 1 available) 25052 1726882465.38025: exiting _queue_task() for managed_node2/stat 25052 1726882465.38037: done queuing things up, now waiting for results queue to drain 25052 1726882465.38038: waiting for pending results... 
25052 1726882465.38411: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 25052 1726882465.38418: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000cc 25052 1726882465.38421: variable 'ansible_search_path' from source: unknown 25052 1726882465.38424: variable 'ansible_search_path' from source: unknown 25052 1726882465.38427: calling self._execute() 25052 1726882465.38600: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.38604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.38606: variable 'omit' from source: magic vars 25052 1726882465.39171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882465.39598: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882465.39657: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882465.39710: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882465.39755: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882465.39864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882465.39913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882465.39949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882465.39988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882465.40201: Evaluated conditional (not __network_is_ostree is defined): True 25052 1726882465.40205: variable 'omit' from source: magic vars 25052 1726882465.40210: variable 'omit' from source: magic vars 25052 1726882465.40267: variable 'omit' from source: magic vars 25052 1726882465.40302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882465.40340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882465.40376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882465.40419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882465.40423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882465.40466: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882465.40486: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.40489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.40745: Set connection var ansible_pipelining to False 25052 1726882465.40748: Set connection var ansible_connection to ssh 25052 1726882465.40750: Set connection 
var ansible_shell_type to sh 25052 1726882465.40753: Set connection var ansible_timeout to 10 25052 1726882465.40755: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882465.40756: Set connection var ansible_shell_executable to /bin/sh 25052 1726882465.40762: variable 'ansible_shell_executable' from source: unknown 25052 1726882465.40769: variable 'ansible_connection' from source: unknown 25052 1726882465.40775: variable 'ansible_module_compression' from source: unknown 25052 1726882465.40781: variable 'ansible_shell_type' from source: unknown 25052 1726882465.40788: variable 'ansible_shell_executable' from source: unknown 25052 1726882465.40801: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.40809: variable 'ansible_pipelining' from source: unknown 25052 1726882465.40816: variable 'ansible_timeout' from source: unknown 25052 1726882465.40825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.40981: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882465.41003: variable 'omit' from source: magic vars 25052 1726882465.41156: starting attempt loop 25052 1726882465.41160: running the handler 25052 1726882465.41162: _low_level_execute_command(): starting 25052 1726882465.41165: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882465.41927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882465.41944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882465.41963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882465.42074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882465.42089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882465.42117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.42205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.44301: stdout chunk (state=3): >>>/root <<< 25052 1726882465.44454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.44476: stderr chunk (state=3): >>><<< 25052 1726882465.44479: stdout chunk (state=3): >>><<< 25052 1726882465.44505: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882465.44519: _low_level_execute_command(): starting 25052 1726882465.44525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420 `" && echo ansible-tmp-1726882465.4450421-25170-113167778769420="` echo /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420 `" ) && sleep 0' 25052 1726882465.44972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882465.44976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882465.44978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882465.44981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882465.44983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.45031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882465.45048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.45109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.47830: stdout chunk (state=3): >>>ansible-tmp-1726882465.4450421-25170-113167778769420=/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420 <<< 25052 1726882465.47998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.48024: stderr chunk (state=3): >>><<< 25052 1726882465.48028: stdout chunk (state=3): >>><<< 25052 1726882465.48043: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882465.4450421-25170-113167778769420=/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882465.48086: variable 'ansible_module_compression' from source: unknown 25052 1726882465.48138: ANSIBALLZ: Using lock for stat 25052 1726882465.48142: ANSIBALLZ: Acquiring lock 25052 1726882465.48144: ANSIBALLZ: Lock acquired: 140207138427904 25052 1726882465.48146: ANSIBALLZ: Creating module 25052 1726882465.55988: ANSIBALLZ: Writing module into payload 25052 1726882465.56053: ANSIBALLZ: Writing module 25052 1726882465.56073: ANSIBALLZ: Renaming module 25052 1726882465.56078: ANSIBALLZ: Done creating module 25052 1726882465.56097: variable 'ansible_facts' from source: unknown 25052 1726882465.56141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py 25052 1726882465.56242: Sending initial data 25052 1726882465.56245: Sent initial data (153 bytes) 25052 1726882465.56716: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882465.56719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.56722: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882465.56724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.56774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882465.56778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882465.56784: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25052 1726882465.56856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.59160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882465.59230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882465.59296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpbhfp9_bk /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py <<< 25052 1726882465.59303: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py" <<< 25052 1726882465.59368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpbhfp9_bk" to remote "/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py" <<< 25052 1726882465.59370: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py" <<< 25052 1726882465.60021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.60034: stderr chunk (state=3): >>><<< 25052 1726882465.60042: stdout chunk (state=3): >>><<< 25052 1726882465.60063: done transferring module to remote 25052 1726882465.60077: _low_level_execute_command(): starting 25052 1726882465.60080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/ /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py && sleep 0' 25052 1726882465.60704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.60758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.63462: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.63478: stderr chunk (state=3): >>><<< 25052 1726882465.63521: stdout chunk (state=3): >>><<< 25052 1726882465.63576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882465.63586: _low_level_execute_command(): starting 25052 1726882465.63707: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/AnsiballZ_stat.py && sleep 0' 25052 1726882465.64591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882465.64648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882465.64700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.64766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882465.64820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.64906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882465.67638: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 25052 1726882465.67647: stdout chunk (state=3): 
>>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 25052 1726882465.67670: stdout chunk (state=3): >>>import 'posix' # <<< 25052 1726882465.67704: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 25052 1726882465.67728: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 25052 1726882465.67852: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # <<< 25052 1726882465.67877: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb86184d0> <<< 25052 1726882465.67921: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85e7b30> <<< 25052 1726882465.68030: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb861aa50> <<< 25052 1726882465.68034: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 25052 1726882465.68117: stdout chunk (state=3): >>>import '_collections_abc' # <<< 25052 1726882465.68137: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 25052 1726882465.68162: stdout chunk (state=3): >>>import 'os' # <<< 25052 1726882465.68191: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 25052 1726882465.68316: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 25052 1726882465.68321: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb842d130> <<< 25052 1726882465.68396: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb842dfa0> <<< 25052 1726882465.68421: stdout chunk (state=3): >>>import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 25052 1726882465.68625: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 25052 1726882465.68951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.68958: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 25052 1726882465.68963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb846be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb846bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.68990: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84a3830> <<< 25052 1726882465.69037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 25052 1726882465.69041: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84a3ec0> import '_collections' # <<< 25052 1726882465.69097: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8483b60> import '_functools' # <<< 25052 1726882465.69225: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8481280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8469040> <<< 25052 1726882465.69244: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 25052 1726882465.69269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 25052 1726882465.69295: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 25052 1726882465.69317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 25052 1726882465.69341: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 25052 1726882465.69388: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c37d0> <<< 25052 1726882465.69431: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c23f0> <<< 25052 1726882465.69436: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 25052 1726882465.69440: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c0c20> <<< 25052 1726882465.69502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 25052 1726882465.69578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84682c0> <<< 25052 1726882465.69583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 25052 1726882465.69633: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb84f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb84f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8466de0> <<< 25052 1726882465.69645: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.69710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 25052 1726882465.69782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f92e0> import 'importlib.machinery' # <<< 25052 1726882465.69825: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 25052 1726882465.70015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8510710> <<< 25052 1726882465.70021: stdout chunk (state=3): >>>import 'errno' # <<< 25052 1726882465.70024: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb8511df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8512c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb85132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85121e0> <<< 25052 1726882465.70220: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb8513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 25052 1726882465.70250: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb829fbf0> <<< 25052 1726882465.70274: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 25052 1726882465.70304: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.70333: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.70369: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 25052 1726882465.70386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 25052 1726882465.70464: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.70580: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c9040> <<< 25052 1726882465.70675: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.70704: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c99a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c88f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb829dd90> <<< 25052 1726882465.70750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 25052 1726882465.70773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82cadb0> <<< 25052 1726882465.70806: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c9af0> <<< 25052 1726882465.70823: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fac30> <<< 25052 1726882465.70923: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 25052 1726882465.70951: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 25052 1726882465.70963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 25052 1726882465.71048: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82f7110> <<< 25052 1726882465.71058: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 25052 1726882465.71104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 25052 1726882465.71151: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb83174a0> <<< 25052 1726882465.71166: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 25052 1726882465.71445: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8378260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 25052 1726882465.71597: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb837a9c0> <<< 25052 1726882465.71713: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8378380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8341280> <<< 25052 1726882465.71755: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8181340> <<< 25052 1726882465.71771: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb83162a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82cbce0> <<< 25052 1726882465.72037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcdb81815b0> <<< 25052 1726882465.72180: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_gqwz8f_7/ansible_stat_payload.zip' # zipimport: zlib available <<< 25052 1726882465.72398: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 25052 1726882465.72432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 25052 1726882465.72470: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 25052 1726882465.72522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 25052 1726882465.72638: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 25052 1726882465.72641: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d70b0> <<< 25052 1726882465.72669: stdout chunk (state=3): >>>import '_typing' # <<< 25052 1726882465.72939: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81b5fa0> <<< 25052 1726882465.72950: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81b5160> # zipimport: zlib available <<< 25052 1726882465.72985: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 25052 1726882465.73021: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.73044: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 25052 1726882465.74908: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.76556: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d4f80> <<< 25052 1726882465.76560: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 25052 1726882465.76607: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 25052 1726882465.76610: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 25052 1726882465.76666: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.76669: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81fe9c0> <<< 25052 1726882465.76709: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe750> <<< 25052 1726882465.76753: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe060> <<< 25052 1726882465.76767: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 25052 1726882465.76803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 25052 1726882465.76820: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe7e0> <<< 25052 1726882465.76844: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d7d40> import 'atexit' # <<< 25052 1726882465.76923: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81ff740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.76942: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81ff980> <<< 25052 1726882465.76959: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 25052 1726882465.77015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 25052 1726882465.77096: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81ffec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 25052 1726882465.77126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 25052 1726882465.77202: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b11ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b138c0> <<< 25052 1726882465.77240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 25052 1726882465.77290: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b142c0> <<< 25052 1726882465.77386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b15460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 25052 1726882465.77424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 
25052 1726882465.77522: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b17f20> <<< 25052 1726882465.77588: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c82f0> <<< 25052 1726882465.77595: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b161e0> <<< 25052 1726882465.77627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 25052 1726882465.77643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 25052 1726882465.77681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 25052 1726882465.77744: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 25052 1726882465.77761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 25052 1726882465.77787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1fce0> <<< 25052 1726882465.77826: stdout chunk (state=3): >>>import '_tokenize' # <<< 25052 1726882465.77881: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1e7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1e510> <<< 25052 1726882465.77907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 25052 1726882465.78110: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b15010> <<< 25052 1726882465.78120: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b166f0> <<< 25052 1726882465.78142: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b67f50> <<< 25052 1726882465.78261: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b680e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 25052 1726882465.78480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b69bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b69970> <<< 25052 1726882465.78484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.78519: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b6c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 25052 1726882465.78570: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.78622: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 25052 1726882465.78625: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 25052 1726882465.78715: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6f830> <<< 25052 1726882465.78820: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6c200> <<< 25052 1726882465.78927: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.78942: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b705f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b70830> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b70b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b682c0> <<< 25052 1726882465.78974: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 25052 1726882465.79050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 25052 1726882465.79054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 25052 1726882465.79063: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7bfc1d0> <<< 25052 1726882465.79207: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7bfd400> <<< 25052 1726882465.79240: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b72990> <<< 25052 1726882465.79389: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b73d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b725d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 25052 1726882465.79394: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.79486: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.79518: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.79541: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 25052 1726882465.79664: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.79799: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.80339: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.80896: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.80936: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 25052 1726882465.80947: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7a016a0> <<< 25052 1726882465.81207: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 25052 1726882465.81229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a024b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7bfd670> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 25052 1726882465.81275: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.81438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 25052 1726882465.81459: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a02270> # zipimport: zlib available <<< 25052 1726882465.81898: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.82353: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.82505: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.82570: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 25052 1726882465.82631: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.82730: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 25052 1726882465.82767: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 25052 1726882465.82796: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.82864: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 25052 1726882465.83089: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.83317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 25052 1726882465.83351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 25052 1726882465.83368: stdout chunk (state=3): >>>import '_ast' # <<< 25052 1726882465.83434: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a03620> # zipimport: zlib available <<< 25052 1726882465.83598: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 
1726882465.83629: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 25052 1726882465.83650: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.83701: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 25052 1726882465.83755: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.83790: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.83854: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.83911: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 25052 1726882465.83955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.84051: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7a0e090> <<< 25052 1726882465.84087: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a08fb0> <<< 25052 1726882465.84168: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 25052 1726882465.84226: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.84258: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.84307: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 25052 1726882465.84327: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 25052 1726882465.84374: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 25052 1726882465.84427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 25052 1726882465.84513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 25052 1726882465.84516: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8252ab0> <<< 25052 1726882465.84607: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8246780> <<< 25052 1726882465.84645: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a0e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a03020> # destroy ansible.module_utils.distro <<< 25052 1726882465.84657: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 25052 1726882465.84671: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.84706: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 25052 1726882465.84798: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 25052 1726882465.84813: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 25052 1726882465.84832: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.85017: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.85109: stdout chunk (state=3): >>># zipimport: zlib available <<< 25052 1726882465.85243: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 25052 1726882465.85532: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 25052 1726882465.85570: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout <<< 25052 1726882465.85665: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix <<< 25052 1726882465.85732: stdout chunk (state=3): >>># cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize <<< 25052 1726882465.85736: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 25052 1726882465.85933: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 25052 1726882465.85999: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib <<< 25052 1726882465.86128: stdout chunk (state=3): >>># destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy 
json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 25052 1726882465.86139: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 25052 1726882465.86272: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 25052 1726882465.86337: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 25052 1726882465.86351: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 25052 1726882465.86545: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 25052 1726882465.86551: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 25052 
1726882465.86639: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 25052 1726882465.86704: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 25052 1726882465.86737: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 25052 1726882465.87102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.87148: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 25052 1726882465.87159: stderr chunk (state=3): >>><<< 25052 1726882465.87209: stdout chunk (state=3): >>><<< 25052 1726882465.87280: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb86184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb861aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb842d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb842dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb846be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb846bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8481280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8469040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84c0c20> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb84f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb84f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8466de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8510710> import 'errno' # 
# extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb8511df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8512c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb85132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb8513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb85134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb829fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c86e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c8710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object 
from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c9040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c99a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c88f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb829dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82cadb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82c9af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb84fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82f7110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb83174a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8378260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb837a9c0> import 'urllib.parse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8378380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8341280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8181340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb83162a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb82cbce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcdb81815b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_gqwz8f_7/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d70b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81b5fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81b5160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d4f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81fe9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object 
from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81fe7e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81d7d40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81ff740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb81ff980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb81ffec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b11ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b15460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b17f20> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb82c82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b161e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # 
code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1fce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1e7b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b1e510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b15010> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b166f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b67f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b680e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b69bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b69970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b6c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6f830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b6c200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b705f0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b70830> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b70b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7bfc1d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7bfd400> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b72990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7b73d10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7b725d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcdb7a016a0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a024b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7bfd670> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a02270> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a03620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fcdb7a0e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a08fb0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8252ab0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb8246780> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a0e1e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcdb7a03020> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] 
removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy 
string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma 
# destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy 
ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
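For orientation: the module arguments embedded in the trace above (path=/run/ostree-booted, with checksum/mime/attribute collection enabled) and the __ostree_booted_stat variable that appears a few lines further down suggest this remote invocation belongs to the "Check if system is ostree" task reported below. A minimal sketch of such a task, assuming the result is registered under that variable name; the actual task file is not shown in this log, so details may differ:

    # el_repo_setup.yml (sketch, not verbatim): stat the ostree marker file
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat

The stat result returned above ("exists": false) is what drives the ostree flag set in the next task.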
[WARNING]: Module invocation had junk after the JSON data: [same Python module-shutdown trace as printed above, repeated verbatim] 25052 1726882465.88405: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882465.88405: _low_level_execute_command(): starting 25052 1726882465.88411: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726882465.4450421-25170-113167778769420/ > /dev/null 2>&1 && sleep 0' 25052 1726882465.88600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882465.88619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882465.88632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882465.88670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882465.88681: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882465.88707: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882465.88734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882465.88808: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882465.88833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882465.88972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882465.90860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882465.90864: stderr chunk (state=3): >>><<< 25052 1726882465.91000: stdout chunk (state=3): >>><<< 25052 1726882465.91004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882465.91007: handler run complete 25052 1726882465.91009: attempt loop complete, returning result 25052 1726882465.91012: _execute() done 25052 1726882465.91155: dumping result to json 25052 1726882465.91158: done dumping result, returning 25052 1726882465.91161: done running TaskExecutor() for 
managed_node2/TASK: Check if system is ostree [12673a56-9f93-f7f6-4a6d-0000000000cc] 25052 1726882465.91163: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cc ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 25052 1726882465.91360: no more pending results, returning what we have 25052 1726882465.91363: results queue empty 25052 1726882465.91364: checking for any_errors_fatal 25052 1726882465.91371: done checking for any_errors_fatal 25052 1726882465.91371: checking for max_fail_percentage 25052 1726882465.91373: done checking for max_fail_percentage 25052 1726882465.91374: checking to see if all hosts have failed and the running result is not ok 25052 1726882465.91375: done checking to see if all hosts have failed 25052 1726882465.91376: getting the remaining hosts for this loop 25052 1726882465.91377: done getting the remaining hosts for this loop 25052 1726882465.91380: getting the next task for host managed_node2 25052 1726882465.91386: done getting next task for host managed_node2 25052 1726882465.91389: ^ task is: TASK: Set flag to indicate system is ostree 25052 1726882465.91392: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882465.91500: getting variables 25052 1726882465.91502: in VariableManager get_vars() 25052 1726882465.91540: Calling all_inventory to load vars for managed_node2 25052 1726882465.91543: Calling groups_inventory to load vars for managed_node2 25052 1726882465.91547: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882465.91558: Calling all_plugins_play to load vars for managed_node2 25052 1726882465.91561: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882465.91564: Calling groups_plugins_play to load vars for managed_node2 25052 1726882465.92239: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cc 25052 1726882465.92243: WORKER PROCESS EXITING 25052 1726882465.92266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882465.92769: done with get_vars() 25052 1726882465.92865: done getting variables 25052 1726882465.92961: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:34:25 -0400 (0:00:00.553) 0:00:02.885 ****** 25052 1726882465.93110: entering _queue_task() for managed_node2/set_fact 25052 1726882465.93112: Creating lock for set_fact 25052 1726882465.93664: worker is 1 (out of 1 available) 25052 1726882465.93675: exiting _queue_task() for managed_node2/set_fact 25052 
1726882465.93685: done queuing things up, now waiting for results queue to drain 25052 1726882465.93686: waiting for pending results... 25052 1726882465.94042: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 25052 1726882465.94237: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000cd 25052 1726882465.94249: variable 'ansible_search_path' from source: unknown 25052 1726882465.94252: variable 'ansible_search_path' from source: unknown 25052 1726882465.94285: calling self._execute() 25052 1726882465.94387: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.94392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.94405: variable 'omit' from source: magic vars 25052 1726882465.95376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882465.95918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882465.96076: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882465.96113: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882465.96286: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882465.96378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882465.96517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882465.96542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882465.96568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882465.96827: Evaluated conditional (not __network_is_ostree is defined): True 25052 1726882465.96863: variable 'omit' from source: magic vars 25052 1726882465.97102: variable 'omit' from source: magic vars 25052 1726882465.97105: variable '__ostree_booted_stat' from source: set_fact 25052 1726882465.97131: variable 'omit' from source: magic vars 25052 1726882465.97160: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882465.97187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882465.97210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882465.97228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882465.97238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882465.97303: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882465.97306: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.97309: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.97439: Set connection var ansible_pipelining to False 25052 1726882465.97443: Set connection var ansible_connection to ssh 25052 1726882465.97445: Set connection var ansible_shell_type to sh 25052 1726882465.97447: Set connection var ansible_timeout to 10 25052 1726882465.97450: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882465.97452: Set connection var ansible_shell_executable to /bin/sh 25052 1726882465.97454: variable 'ansible_shell_executable' from source: unknown 25052 1726882465.97457: variable 'ansible_connection' from source: unknown 25052 1726882465.97459: variable 'ansible_module_compression' from source: unknown 25052 1726882465.97461: variable 'ansible_shell_type' from source: unknown 25052 1726882465.97462: variable 'ansible_shell_executable' from source: unknown 25052 1726882465.97464: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.97466: variable 'ansible_pipelining' from source: unknown 25052 1726882465.97468: variable 'ansible_timeout' from source: unknown 25052 1726882465.97470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.97643: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882465.97654: variable 'omit' from source: magic vars 25052 1726882465.97657: starting attempt loop 25052 1726882465.97659: running the handler 25052 1726882465.97661: handler run complete 25052 1726882465.97663: attempt loop complete, returning result 25052 1726882465.97664: _execute() done 25052 1726882465.97666: dumping result to json 25052 1726882465.97668: done dumping result, returning 25052 1726882465.97669: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [12673a56-9f93-f7f6-4a6d-0000000000cd] 25052 1726882465.97671: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cd 25052 1726882465.97730: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cd 25052 1726882465.97732: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 25052 1726882465.97907: no more pending results, returning what we have 25052 1726882465.97910: results queue empty 25052 1726882465.97911: checking for any_errors_fatal 25052 1726882465.97917: done checking for any_errors_fatal 25052 1726882465.97918: checking for max_fail_percentage 25052 1726882465.97919: done checking for max_fail_percentage 25052 1726882465.97920: checking to see if all hosts have failed and the running result is not ok 25052 1726882465.97921: done checking to see if all hosts have failed 25052 1726882465.97921: getting the remaining hosts for this loop 25052 1726882465.97922: done getting the remaining hosts for this loop 25052 1726882465.97925: getting the next task for host managed_node2 25052 1726882465.97932: done getting next task for host managed_node2 25052 1726882465.97934: ^ task is: TASK: Fix CentOS6 Base repo 25052 1726882465.97937: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882465.97940: getting variables 25052 1726882465.97941: in VariableManager get_vars() 25052 1726882465.97967: Calling all_inventory to load vars for managed_node2 25052 1726882465.97970: Calling groups_inventory to load vars for managed_node2 25052 1726882465.97973: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882465.97982: Calling all_plugins_play to load vars for managed_node2 25052 1726882465.97985: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882465.97996: Calling groups_plugins_play to load vars for managed_node2 25052 1726882465.98411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882465.98729: done with get_vars() 25052 1726882465.98740: done getting variables 25052 1726882465.98874: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:34:25 -0400 (0:00:00.058) 0:00:02.943 ****** 25052 1726882465.98913: entering _queue_task() for managed_node2/copy 25052 1726882465.99198: worker is 1 (out of 1 available) 25052 1726882465.99215: exiting _queue_task() for managed_node2/copy 25052 1726882465.99230: done queuing things up, now waiting for results queue to drain 25052 1726882465.99232: waiting for pending results... 
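The preceding trace for "Set flag to indicate system is ostree" (el_repo_setup.yml:22) evaluates the conditional not __network_is_ostree is defined, reads __ostree_booted_stat, and ends with the fact __network_is_ostree: false. A plausible sketch of that set_fact task, assuming it simply mirrors the exists flag of the registered stat result:

    # el_repo_setup.yml:22 (sketch): derive the ostree flag from the stat result
    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Since /run/ostree-booted does not exist on this host, the flag comes out false, which is what later lets the enable_epel.yml include run.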
25052 1726882465.99451: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 25052 1726882465.99655: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000cf 25052 1726882465.99659: variable 'ansible_search_path' from source: unknown 25052 1726882465.99662: variable 'ansible_search_path' from source: unknown 25052 1726882465.99665: calling self._execute() 25052 1726882465.99668: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882465.99670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882465.99673: variable 'omit' from source: magic vars 25052 1726882466.00164: variable 'ansible_distribution' from source: facts 25052 1726882466.00184: Evaluated conditional (ansible_distribution == 'CentOS'): True 25052 1726882466.00312: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.00315: Evaluated conditional (ansible_distribution_major_version == '6'): False 25052 1726882466.00318: when evaluation is False, skipping this task 25052 1726882466.00321: _execute() done 25052 1726882466.00336: dumping result to json 25052 1726882466.00339: done dumping result, returning 25052 1726882466.00342: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [12673a56-9f93-f7f6-4a6d-0000000000cf] 25052 1726882466.00344: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cf 25052 1726882466.00647: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000cf 25052 1726882466.00650: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25052 1726882466.00706: no more pending results, returning what we have 25052 1726882466.00709: results queue empty 25052 1726882466.00710: checking for any_errors_fatal 25052 1726882466.00715: done checking for any_errors_fatal 25052 1726882466.00715: checking for max_fail_percentage 25052 1726882466.00717: done checking for max_fail_percentage 25052 1726882466.00718: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.00718: done checking to see if all hosts have failed 25052 1726882466.00719: getting the remaining hosts for this loop 25052 1726882466.00720: done getting the remaining hosts for this loop 25052 1726882466.00723: getting the next task for host managed_node2 25052 1726882466.00728: done getting next task for host managed_node2 25052 1726882466.00730: ^ task is: TASK: Include the task 'enable_epel.yml' 25052 1726882466.00733: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.00736: getting variables 25052 1726882466.00738: in VariableManager get_vars() 25052 1726882466.00846: Calling all_inventory to load vars for managed_node2 25052 1726882466.00899: Calling groups_inventory to load vars for managed_node2 25052 1726882466.00904: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.00912: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.00915: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.00917: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.01266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.01684: done with get_vars() 25052 1726882466.01818: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:34:26 -0400 (0:00:00.030) 0:00:02.974 ****** 25052 1726882466.02007: entering _queue_task() for managed_node2/include_tasks 25052 1726882466.02548: worker is 1 (out of 1 available) 25052 1726882466.02560: exiting _queue_task() for managed_node2/include_tasks 25052 1726882466.02571: done queuing things up, now waiting for results queue to drain 25052 1726882466.02572: waiting for pending results... 25052 1726882466.02912: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 25052 1726882466.02999: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000d0 25052 1726882466.03003: variable 'ansible_search_path' from source: unknown 25052 1726882466.03006: variable 'ansible_search_path' from source: unknown 25052 1726882466.03008: calling self._execute() 25052 1726882466.03071: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.03081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.03097: variable 'omit' from source: magic vars 25052 1726882466.03569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882466.05931: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882466.05978: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882466.06020: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882466.06057: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882466.06078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882466.06156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882466.06178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882466.06227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 25052 1726882466.06241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882466.06256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882466.06368: variable '__network_is_ostree' from source: set_fact 25052 1726882466.06382: Evaluated conditional (not __network_is_ostree | d(false)): True 25052 1726882466.06388: _execute() done 25052 1726882466.06399: dumping result to json 25052 1726882466.06405: done dumping result, returning 25052 1726882466.06408: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-f7f6-4a6d-0000000000d0] 25052 1726882466.06599: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000d0 25052 1726882466.06672: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000d0 25052 1726882466.06675: WORKER PROCESS EXITING 25052 1726882466.06708: no more pending results, returning what we have 25052 1726882466.06713: in VariableManager get_vars() 25052 1726882466.06751: Calling all_inventory to load vars for managed_node2 25052 1726882466.06754: Calling groups_inventory to load vars for managed_node2 25052 1726882466.06758: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.06769: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.06772: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.06775: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.06979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.07174: done with get_vars() 25052 1726882466.07182: variable 'ansible_search_path' from source: unknown 25052 1726882466.07183: variable 'ansible_search_path' from source: unknown 25052 1726882466.07222: we have included files to process 25052 1726882466.07223: generating all_blocks data 25052 1726882466.07224: done generating all_blocks data 25052 1726882466.07229: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25052 1726882466.07230: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25052 1726882466.07232: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 25052 1726882466.08340: done processing included file 25052 1726882466.08342: iterating over new_blocks loaded from include file 25052 1726882466.08343: in VariableManager get_vars() 25052 1726882466.08356: done with get_vars() 25052 1726882466.08358: filtering new block on tags 25052 1726882466.08385: done filtering new block on tags 25052 1726882466.08388: in VariableManager get_vars() 25052 1726882466.08405: done with get_vars() 25052 1726882466.08407: filtering new block on tags 25052 1726882466.08421: done filtering new block on tags 25052 1726882466.08423: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 25052 1726882466.08429: extending task lists for all hosts with included blocks 
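The include above is gated on the fact set earlier: the trace shows the conditional (not __network_is_ostree | d(false)) evaluating to True before enable_epel.yml is loaded for managed_node2. A sketch of what the include at el_repo_setup.yml:51 presumably looks like; the relative path is an assumption, since only the resolved absolute path appears in the log:

    # el_repo_setup.yml:51 (sketch): pull in the EPEL setup tasks on non-ostree systems
    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)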
25052 1726882466.08537: done extending task lists 25052 1726882466.08538: done processing included files 25052 1726882466.08539: results queue empty 25052 1726882466.08540: checking for any_errors_fatal 25052 1726882466.08543: done checking for any_errors_fatal 25052 1726882466.08543: checking for max_fail_percentage 25052 1726882466.08544: done checking for max_fail_percentage 25052 1726882466.08545: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.08546: done checking to see if all hosts have failed 25052 1726882466.08546: getting the remaining hosts for this loop 25052 1726882466.08547: done getting the remaining hosts for this loop 25052 1726882466.08549: getting the next task for host managed_node2 25052 1726882466.08553: done getting next task for host managed_node2 25052 1726882466.08555: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 25052 1726882466.08558: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.08560: getting variables 25052 1726882466.08561: in VariableManager get_vars() 25052 1726882466.08569: Calling all_inventory to load vars for managed_node2 25052 1726882466.08571: Calling groups_inventory to load vars for managed_node2 25052 1726882466.08573: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.08579: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.08585: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.08588: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.08735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.08883: done with get_vars() 25052 1726882466.08895: done getting variables 25052 1726882466.08956: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 25052 1726882466.09153: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:34:26 -0400 (0:00:00.071) 0:00:03.046 ****** 25052 1726882466.09203: entering _queue_task() for managed_node2/command 25052 1726882466.09205: Creating lock for command 25052 1726882466.09521: worker is 1 (out of 1 available) 25052 1726882466.09533: exiting _queue_task() for managed_node2/command 25052 1726882466.09544: done queuing things up, now waiting for results queue to drain 25052 1726882466.09545: waiting for pending results... 
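Note how the task title is templated: the state trace lists it as "Create EPEL {{ ansible_distribution_major_version }}", and the banner above renders it as "Create EPEL 10" once facts are available. The skip result that follows shows it is a command task gated on distribution and major version. A sketch of the shape of the task at enable_epel.yml:8; the actual command never appears in this log, so the command line below is a purely hypothetical placeholder:

    # enable_epel.yml:8 (sketch): templated task name, only runs on EL 7/8
    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: /bin/true   # hypothetical placeholder; the real command does not appear in this trace
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']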
25052 1726882466.09798: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 25052 1726882466.09982: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000ea 25052 1726882466.10019: variable 'ansible_search_path' from source: unknown 25052 1726882466.10053: variable 'ansible_search_path' from source: unknown 25052 1726882466.10094: calling self._execute() 25052 1726882466.10200: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.10204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.10207: variable 'omit' from source: magic vars 25052 1726882466.10683: variable 'ansible_distribution' from source: facts 25052 1726882466.10687: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25052 1726882466.10753: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.10759: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25052 1726882466.10762: when evaluation is False, skipping this task 25052 1726882466.10765: _execute() done 25052 1726882466.10768: dumping result to json 25052 1726882466.10770: done dumping result, returning 25052 1726882466.10777: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [12673a56-9f93-f7f6-4a6d-0000000000ea] 25052 1726882466.10785: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ea 25052 1726882466.10900: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ea 25052 1726882466.10903: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25052 1726882466.10968: no more pending results, returning what we have 25052 1726882466.10972: results queue empty 25052 1726882466.10972: checking for any_errors_fatal 25052 1726882466.10974: done checking for any_errors_fatal 25052 1726882466.10974: checking for max_fail_percentage 25052 1726882466.10976: done checking for max_fail_percentage 25052 1726882466.10976: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.10977: done checking to see if all hosts have failed 25052 1726882466.10978: getting the remaining hosts for this loop 25052 1726882466.10979: done getting the remaining hosts for this loop 25052 1726882466.10982: getting the next task for host managed_node2 25052 1726882466.10987: done getting next task for host managed_node2 25052 1726882466.10990: ^ task is: TASK: Install yum-utils package 25052 1726882466.10995: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.11000: getting variables 25052 1726882466.11002: in VariableManager get_vars() 25052 1726882466.11034: Calling all_inventory to load vars for managed_node2 25052 1726882466.11037: Calling groups_inventory to load vars for managed_node2 25052 1726882466.11040: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.11052: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.11055: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.11057: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.11183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.11320: done with get_vars() 25052 1726882466.11326: done getting variables 25052 1726882466.11395: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:34:26 -0400 (0:00:00.022) 0:00:03.069 ****** 25052 1726882466.11414: entering _queue_task() for managed_node2/package 25052 1726882466.11415: Creating lock for package 25052 1726882466.11606: worker is 1 (out of 1 available) 25052 1726882466.11620: exiting _queue_task() for managed_node2/package 25052 1726882466.11631: done queuing things up, now waiting for results queue to drain 25052 1726882466.11632: waiting for pending results... 
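The two skip results above ("Create EPEL 10" and "Install yum-utils package") follow the same pattern: the first guard, ansible_distribution in ['RedHat', 'CentOS'], evaluates True on this CentOS host, the second, ansible_distribution_major_version in ['7', '8'], evaluates False (the rendered task name already shows major version 10), so the executor skips the task and reports the failing expression back as "false_condition". A sketch of the guard shape that produces this output; the when expressions are taken verbatim from the log, the package arguments are assumptions:

# Guard pattern implied by the skip results above; module arguments are assumed.
# When any when-clause evaluates False, the task is skipped and the failing
# expression is returned as "false_condition".
- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils      # assumed from the task name
    state: present       # assumed
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']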
25052 1726882466.11786: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 25052 1726882466.11954: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000eb 25052 1726882466.11958: variable 'ansible_search_path' from source: unknown 25052 1726882466.11960: variable 'ansible_search_path' from source: unknown 25052 1726882466.11964: calling self._execute() 25052 1726882466.12032: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.12043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.12059: variable 'omit' from source: magic vars 25052 1726882466.12441: variable 'ansible_distribution' from source: facts 25052 1726882466.12459: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25052 1726882466.12687: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.12690: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25052 1726882466.12698: when evaluation is False, skipping this task 25052 1726882466.12701: _execute() done 25052 1726882466.12704: dumping result to json 25052 1726882466.12706: done dumping result, returning 25052 1726882466.12708: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [12673a56-9f93-f7f6-4a6d-0000000000eb] 25052 1726882466.12711: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000eb skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25052 1726882466.12859: no more pending results, returning what we have 25052 1726882466.12862: results queue empty 25052 1726882466.12862: checking for any_errors_fatal 25052 1726882466.12868: done checking for any_errors_fatal 25052 1726882466.12869: checking for max_fail_percentage 25052 1726882466.12871: done checking for max_fail_percentage 25052 1726882466.12871: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.12872: done checking to see if all hosts have failed 25052 1726882466.12873: getting the remaining hosts for this loop 25052 1726882466.12874: done getting the remaining hosts for this loop 25052 1726882466.12877: getting the next task for host managed_node2 25052 1726882466.12882: done getting next task for host managed_node2 25052 1726882466.12884: ^ task is: TASK: Enable EPEL 7 25052 1726882466.12888: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.12890: getting variables 25052 1726882466.12895: in VariableManager get_vars() 25052 1726882466.12917: Calling all_inventory to load vars for managed_node2 25052 1726882466.12919: Calling groups_inventory to load vars for managed_node2 25052 1726882466.12922: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.12930: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.12933: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.12935: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.13066: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000eb 25052 1726882466.13070: WORKER PROCESS EXITING 25052 1726882466.13081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.13197: done with get_vars() 25052 1726882466.13204: done getting variables 25052 1726882466.13249: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:34:26 -0400 (0:00:00.018) 0:00:03.087 ****** 25052 1726882466.13269: entering _queue_task() for managed_node2/command 25052 1726882466.13439: worker is 1 (out of 1 available) 25052 1726882466.13451: exiting _queue_task() for managed_node2/command 25052 1726882466.13462: done queuing things up, now waiting for results queue to drain 25052 1726882466.13463: waiting for pending results... 
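Also worth noting above: "Creating lock for command" and "Creating lock for package" appear only the first time each action plugin is queued (found_in_cache=False); when "Enable EPEL 7" is queued here, the command action is already cached (found_in_cache=True) and no new lock is created. A hypothetical reconstruction of the "Enable EPEL 7" task follows; only the task name, the command action, and the evaluated when-conditions are visible in the log, the actual command line is an assumption:

# Hypothetical reconstruction of "Enable EPEL 7"; the command line is assumed,
# the conditions match the expressions evaluated in the log below.
- name: Enable EPEL 7
  ansible.builtin.command: yum-config-manager --enable epel   # assumed command
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']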
25052 1726882466.13609: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 25052 1726882466.13672: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000ec 25052 1726882466.13681: variable 'ansible_search_path' from source: unknown 25052 1726882466.13686: variable 'ansible_search_path' from source: unknown 25052 1726882466.13715: calling self._execute() 25052 1726882466.13770: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.13774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.13782: variable 'omit' from source: magic vars 25052 1726882466.14042: variable 'ansible_distribution' from source: facts 25052 1726882466.14059: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25052 1726882466.14142: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.14146: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25052 1726882466.14149: when evaluation is False, skipping this task 25052 1726882466.14151: _execute() done 25052 1726882466.14158: dumping result to json 25052 1726882466.14160: done dumping result, returning 25052 1726882466.14171: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [12673a56-9f93-f7f6-4a6d-0000000000ec] 25052 1726882466.14174: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ec 25052 1726882466.14250: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ec 25052 1726882466.14252: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25052 1726882466.14298: no more pending results, returning what we have 25052 1726882466.14302: results queue empty 25052 1726882466.14302: checking for any_errors_fatal 25052 1726882466.14307: done checking for any_errors_fatal 25052 1726882466.14308: checking for max_fail_percentage 25052 1726882466.14309: done checking for max_fail_percentage 25052 1726882466.14310: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.14311: done checking to see if all hosts have failed 25052 1726882466.14311: getting the remaining hosts for this loop 25052 1726882466.14312: done getting the remaining hosts for this loop 25052 1726882466.14315: getting the next task for host managed_node2 25052 1726882466.14320: done getting next task for host managed_node2 25052 1726882466.14322: ^ task is: TASK: Enable EPEL 8 25052 1726882466.14326: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.14329: getting variables 25052 1726882466.14330: in VariableManager get_vars() 25052 1726882466.14352: Calling all_inventory to load vars for managed_node2 25052 1726882466.14354: Calling groups_inventory to load vars for managed_node2 25052 1726882466.14357: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.14364: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.14367: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.14369: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.14504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.14615: done with get_vars() 25052 1726882466.14621: done getting variables 25052 1726882466.14656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:34:26 -0400 (0:00:00.014) 0:00:03.101 ****** 25052 1726882466.14675: entering _queue_task() for managed_node2/command 25052 1726882466.14843: worker is 1 (out of 1 available) 25052 1726882466.14855: exiting _queue_task() for managed_node2/command 25052 1726882466.14867: done queuing things up, now waiting for results queue to drain 25052 1726882466.14868: waiting for pending results... 25052 1726882466.15008: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 25052 1726882466.15066: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000ed 25052 1726882466.15075: variable 'ansible_search_path' from source: unknown 25052 1726882466.15079: variable 'ansible_search_path' from source: unknown 25052 1726882466.15113: calling self._execute() 25052 1726882466.15164: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.15168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.15176: variable 'omit' from source: magic vars 25052 1726882466.15434: variable 'ansible_distribution' from source: facts 25052 1726882466.15443: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25052 1726882466.15531: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.15536: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 25052 1726882466.15539: when evaluation is False, skipping this task 25052 1726882466.15541: _execute() done 25052 1726882466.15543: dumping result to json 25052 1726882466.15546: done dumping result, returning 25052 1726882466.15549: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [12673a56-9f93-f7f6-4a6d-0000000000ed] 25052 1726882466.15557: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ed 25052 1726882466.15634: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ed 25052 1726882466.15637: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 25052 1726882466.15682: no more pending results, returning what we 
have 25052 1726882466.15685: results queue empty 25052 1726882466.15686: checking for any_errors_fatal 25052 1726882466.15690: done checking for any_errors_fatal 25052 1726882466.15690: checking for max_fail_percentage 25052 1726882466.15692: done checking for max_fail_percentage 25052 1726882466.15694: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.15695: done checking to see if all hosts have failed 25052 1726882466.15696: getting the remaining hosts for this loop 25052 1726882466.15697: done getting the remaining hosts for this loop 25052 1726882466.15700: getting the next task for host managed_node2 25052 1726882466.15707: done getting next task for host managed_node2 25052 1726882466.15709: ^ task is: TASK: Enable EPEL 6 25052 1726882466.15712: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882466.15715: getting variables 25052 1726882466.15716: in VariableManager get_vars() 25052 1726882466.15737: Calling all_inventory to load vars for managed_node2 25052 1726882466.15739: Calling groups_inventory to load vars for managed_node2 25052 1726882466.15742: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.15750: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.15752: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.15755: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.15858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.15970: done with get_vars() 25052 1726882466.15976: done getting variables 25052 1726882466.16017: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:34:26 -0400 (0:00:00.013) 0:00:03.115 ****** 25052 1726882466.16037: entering _queue_task() for managed_node2/copy 25052 1726882466.16202: worker is 1 (out of 1 available) 25052 1726882466.16217: exiting _queue_task() for managed_node2/copy 25052 1726882466.16228: done queuing things up, now waiting for results queue to drain 25052 1726882466.16229: waiting for pending results... 
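After "Enable EPEL 7" and "Enable EPEL 8" are skipped on the same version guard, the last task of this included file, "Enable EPEL 6", switches from the command action to the copy action (loaded just above) and, as the result that follows shows, is guarded by ansible_distribution_major_version == '6'. A sketch of that shape; only the task name, the action plugin, and the evaluated conditions are visible in the log, the copy destination and content are assumptions:

# Assumed shape of "Enable EPEL 6": a copy-based task guarded by the EL6-only
# condition reported in the skip result below.
- name: Enable EPEL 6
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed destination
    content: |                         # assumed content
      [epel]
      enabled=1
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'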
25052 1726882466.16367: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 25052 1726882466.16427: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000ef 25052 1726882466.16436: variable 'ansible_search_path' from source: unknown 25052 1726882466.16439: variable 'ansible_search_path' from source: unknown 25052 1726882466.16467: calling self._execute() 25052 1726882466.16521: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.16525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.16531: variable 'omit' from source: magic vars 25052 1726882466.16772: variable 'ansible_distribution' from source: facts 25052 1726882466.16781: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 25052 1726882466.16862: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.16866: Evaluated conditional (ansible_distribution_major_version == '6'): False 25052 1726882466.16870: when evaluation is False, skipping this task 25052 1726882466.16872: _execute() done 25052 1726882466.16877: dumping result to json 25052 1726882466.16880: done dumping result, returning 25052 1726882466.16886: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [12673a56-9f93-f7f6-4a6d-0000000000ef] 25052 1726882466.16895: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ef 25052 1726882466.16973: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000ef 25052 1726882466.16976: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 25052 1726882466.17040: no more pending results, returning what we have 25052 1726882466.17043: results queue empty 25052 1726882466.17044: checking for any_errors_fatal 25052 1726882466.17047: done checking for any_errors_fatal 25052 1726882466.17048: checking for max_fail_percentage 25052 1726882466.17050: done checking for max_fail_percentage 25052 1726882466.17050: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.17051: done checking to see if all hosts have failed 25052 1726882466.17051: getting the remaining hosts for this loop 25052 1726882466.17052: done getting the remaining hosts for this loop 25052 1726882466.17055: getting the next task for host managed_node2 25052 1726882466.17061: done getting next task for host managed_node2 25052 1726882466.17063: ^ task is: TASK: Set network provider to 'nm' 25052 1726882466.17065: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.17068: getting variables 25052 1726882466.17069: in VariableManager get_vars() 25052 1726882466.17087: Calling all_inventory to load vars for managed_node2 25052 1726882466.17089: Calling groups_inventory to load vars for managed_node2 25052 1726882466.17091: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.17100: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.17104: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.17106: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.17236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.17349: done with get_vars() 25052 1726882466.17356: done getting variables 25052 1726882466.17391: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:13 Friday 20 September 2024 21:34:26 -0400 (0:00:00.013) 0:00:03.128 ****** 25052 1726882466.17413: entering _queue_task() for managed_node2/set_fact 25052 1726882466.17577: worker is 1 (out of 1 available) 25052 1726882466.17589: exiting _queue_task() for managed_node2/set_fact 25052 1726882466.17601: done queuing things up, now waiting for results queue to drain 25052 1726882466.17602: waiting for pending results... 25052 1726882466.17738: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 25052 1726882466.17784: in run() - task 12673a56-9f93-f7f6-4a6d-000000000007 25052 1726882466.17795: variable 'ansible_search_path' from source: unknown 25052 1726882466.17826: calling self._execute() 25052 1726882466.17878: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.17881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.17889: variable 'omit' from source: magic vars 25052 1726882466.17964: variable 'omit' from source: magic vars 25052 1726882466.17981: variable 'omit' from source: magic vars 25052 1726882466.18008: variable 'omit' from source: magic vars 25052 1726882466.18038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882466.18067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882466.18081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882466.18094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882466.18108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882466.18129: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882466.18132: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.18134: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.18208: Set connection var ansible_pipelining to False 
25052 1726882466.18211: Set connection var ansible_connection to ssh 25052 1726882466.18213: Set connection var ansible_shell_type to sh 25052 1726882466.18219: Set connection var ansible_timeout to 10 25052 1726882466.18225: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882466.18230: Set connection var ansible_shell_executable to /bin/sh 25052 1726882466.18245: variable 'ansible_shell_executable' from source: unknown 25052 1726882466.18247: variable 'ansible_connection' from source: unknown 25052 1726882466.18250: variable 'ansible_module_compression' from source: unknown 25052 1726882466.18252: variable 'ansible_shell_type' from source: unknown 25052 1726882466.18254: variable 'ansible_shell_executable' from source: unknown 25052 1726882466.18257: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.18261: variable 'ansible_pipelining' from source: unknown 25052 1726882466.18265: variable 'ansible_timeout' from source: unknown 25052 1726882466.18267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.18364: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882466.18372: variable 'omit' from source: magic vars 25052 1726882466.18378: starting attempt loop 25052 1726882466.18381: running the handler 25052 1726882466.18391: handler run complete 25052 1726882466.18403: attempt loop complete, returning result 25052 1726882466.18406: _execute() done 25052 1726882466.18408: dumping result to json 25052 1726882466.18410: done dumping result, returning 25052 1726882466.18416: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [12673a56-9f93-f7f6-4a6d-000000000007] 25052 1726882466.18420: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000007 25052 1726882466.18488: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000007 25052 1726882466.18498: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 25052 1726882466.18549: no more pending results, returning what we have 25052 1726882466.18551: results queue empty 25052 1726882466.18552: checking for any_errors_fatal 25052 1726882466.18556: done checking for any_errors_fatal 25052 1726882466.18556: checking for max_fail_percentage 25052 1726882466.18558: done checking for max_fail_percentage 25052 1726882466.18559: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.18560: done checking to see if all hosts have failed 25052 1726882466.18560: getting the remaining hosts for this loop 25052 1726882466.18561: done getting the remaining hosts for this loop 25052 1726882466.18564: getting the next task for host managed_node2 25052 1726882466.18568: done getting next task for host managed_node2 25052 1726882466.18570: ^ task is: TASK: meta (flush_handlers) 25052 1726882466.18571: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.18576: getting variables 25052 1726882466.18577: in VariableManager get_vars() 25052 1726882466.18599: Calling all_inventory to load vars for managed_node2 25052 1726882466.18602: Calling groups_inventory to load vars for managed_node2 25052 1726882466.18606: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.18615: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.18617: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.18619: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.18717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.18825: done with get_vars() 25052 1726882466.18834: done getting variables 25052 1726882466.18874: in VariableManager get_vars() 25052 1726882466.18880: Calling all_inventory to load vars for managed_node2 25052 1726882466.18881: Calling groups_inventory to load vars for managed_node2 25052 1726882466.18883: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.18885: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.18886: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.18888: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.18968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.19215: done with get_vars() 25052 1726882466.19225: done queuing things up, now waiting for results queue to drain 25052 1726882466.19226: results queue empty 25052 1726882466.19226: checking for any_errors_fatal 25052 1726882466.19227: done checking for any_errors_fatal 25052 1726882466.19228: checking for max_fail_percentage 25052 1726882466.19228: done checking for max_fail_percentage 25052 1726882466.19229: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.19229: done checking to see if all hosts have failed 25052 1726882466.19230: getting the remaining hosts for this loop 25052 1726882466.19230: done getting the remaining hosts for this loop 25052 1726882466.19232: getting the next task for host managed_node2 25052 1726882466.19234: done getting next task for host managed_node2 25052 1726882466.19235: ^ task is: TASK: meta (flush_handlers) 25052 1726882466.19235: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.19240: getting variables 25052 1726882466.19241: in VariableManager get_vars() 25052 1726882466.19246: Calling all_inventory to load vars for managed_node2 25052 1726882466.19247: Calling groups_inventory to load vars for managed_node2 25052 1726882466.19248: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.19251: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.19252: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.19254: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.19333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.19434: done with get_vars() 25052 1726882466.19440: done getting variables 25052 1726882466.19466: in VariableManager get_vars() 25052 1726882466.19470: Calling all_inventory to load vars for managed_node2 25052 1726882466.19472: Calling groups_inventory to load vars for managed_node2 25052 1726882466.19473: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.19476: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.19477: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.19479: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.19556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.19672: done with get_vars() 25052 1726882466.19679: done queuing things up, now waiting for results queue to drain 25052 1726882466.19681: results queue empty 25052 1726882466.19681: checking for any_errors_fatal 25052 1726882466.19682: done checking for any_errors_fatal 25052 1726882466.19682: checking for max_fail_percentage 25052 1726882466.19683: done checking for max_fail_percentage 25052 1726882466.19683: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.19684: done checking to see if all hosts have failed 25052 1726882466.19684: getting the remaining hosts for this loop 25052 1726882466.19685: done getting the remaining hosts for this loop 25052 1726882466.19686: getting the next task for host managed_node2 25052 1726882466.19687: done getting next task for host managed_node2 25052 1726882466.19688: ^ task is: None 25052 1726882466.19689: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.19689: done queuing things up, now waiting for results queue to drain 25052 1726882466.19690: results queue empty 25052 1726882466.19690: checking for any_errors_fatal 25052 1726882466.19691: done checking for any_errors_fatal 25052 1726882466.19691: checking for max_fail_percentage 25052 1726882466.19692: done checking for max_fail_percentage 25052 1726882466.19694: checking to see if all hosts have failed and the running result is not ok 25052 1726882466.19695: done checking to see if all hosts have failed 25052 1726882466.19696: getting the next task for host managed_node2 25052 1726882466.19698: done getting next task for host managed_node2 25052 1726882466.19698: ^ task is: None 25052 1726882466.19699: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882466.19732: in VariableManager get_vars() 25052 1726882466.19747: done with get_vars() 25052 1726882466.19751: in VariableManager get_vars() 25052 1726882466.19759: done with get_vars() 25052 1726882466.19761: variable 'omit' from source: magic vars 25052 1726882466.19780: in VariableManager get_vars() 25052 1726882466.19788: done with get_vars() 25052 1726882466.19804: variable 'omit' from source: magic vars PLAY [Play for testing IPv6 config] ******************************************** 25052 1726882466.20023: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 25052 1726882466.20044: getting the remaining hosts for this loop 25052 1726882466.20045: done getting the remaining hosts for this loop 25052 1726882466.20047: getting the next task for host managed_node2 25052 1726882466.20049: done getting next task for host managed_node2 25052 1726882466.20050: ^ task is: TASK: Gathering Facts 25052 1726882466.20050: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882466.20052: getting variables 25052 1726882466.20052: in VariableManager get_vars() 25052 1726882466.20061: Calling all_inventory to load vars for managed_node2 25052 1726882466.20062: Calling groups_inventory to load vars for managed_node2 25052 1726882466.20063: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882466.20066: Calling all_plugins_play to load vars for managed_node2 25052 1726882466.20074: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882466.20076: Calling groups_plugins_play to load vars for managed_node2 25052 1726882466.20156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882466.20265: done with get_vars() 25052 1726882466.20271: done getting variables 25052 1726882466.20296: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3 Friday 20 September 2024 21:34:26 -0400 (0:00:00.028) 0:00:03.157 ****** 25052 1726882466.20311: entering _queue_task() for managed_node2/gather_facts 25052 1726882466.20464: worker is 1 (out of 1 available) 25052 1726882466.20474: exiting _queue_task() for managed_node2/gather_facts 25052 1726882466.20484: done queuing things up, now waiting for results queue to drain 25052 1726882466.20485: waiting for pending results... 
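At this point the first play is done: every EPEL task was skipped, and the only task that actually ran was "Set network provider to 'nm'", whose ok result earlier registered the network_provider fact via set_fact. The handlers are then flushed and a new play, "Play for testing IPv6 config", begins by re-gathering facts over the existing SSH connection. A minimal sketch of the provider task implied by that result and by the task path tests_ipv6_nm.yml:13:

# Sketch implied by the earlier ok result ({"network_provider": "nm"});
# the real tests_ipv6_nm.yml may differ in layout.
- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm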
25052 1726882466.20622: running TaskExecutor() for managed_node2/TASK: Gathering Facts 25052 1726882466.20667: in run() - task 12673a56-9f93-f7f6-4a6d-000000000115 25052 1726882466.20678: variable 'ansible_search_path' from source: unknown 25052 1726882466.20708: calling self._execute() 25052 1726882466.20765: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.20769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.20776: variable 'omit' from source: magic vars 25052 1726882466.21024: variable 'ansible_distribution_major_version' from source: facts 25052 1726882466.21033: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882466.21047: variable 'omit' from source: magic vars 25052 1726882466.21063: variable 'omit' from source: magic vars 25052 1726882466.21086: variable 'omit' from source: magic vars 25052 1726882466.21118: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882466.21142: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882466.21164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882466.21239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882466.21249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882466.21274: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882466.21278: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.21281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.21349: Set connection var ansible_pipelining to False 25052 1726882466.21353: Set connection var ansible_connection to ssh 25052 1726882466.21355: Set connection var ansible_shell_type to sh 25052 1726882466.21360: Set connection var ansible_timeout to 10 25052 1726882466.21370: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882466.21380: Set connection var ansible_shell_executable to /bin/sh 25052 1726882466.21394: variable 'ansible_shell_executable' from source: unknown 25052 1726882466.21399: variable 'ansible_connection' from source: unknown 25052 1726882466.21402: variable 'ansible_module_compression' from source: unknown 25052 1726882466.21405: variable 'ansible_shell_type' from source: unknown 25052 1726882466.21408: variable 'ansible_shell_executable' from source: unknown 25052 1726882466.21410: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882466.21413: variable 'ansible_pipelining' from source: unknown 25052 1726882466.21415: variable 'ansible_timeout' from source: unknown 25052 1726882466.21420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882466.21545: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882466.21552: variable 'omit' from source: magic vars 25052 1726882466.21556: starting attempt loop 25052 1726882466.21559: running the 
handler 25052 1726882466.21571: variable 'ansible_facts' from source: unknown 25052 1726882466.21587: _low_level_execute_command(): starting 25052 1726882466.21602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882466.22109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.22113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.22116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.22118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.22172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882466.22176: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882466.22268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882466.24640: stdout chunk (state=3): >>>/root <<< 25052 1726882466.24786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882466.24818: stderr chunk (state=3): >>><<< 25052 1726882466.24821: stdout chunk (state=3): >>><<< 25052 1726882466.24842: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882466.24851: _low_level_execute_command(): starting 25052 1726882466.24858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307 `" && 
echo ansible-tmp-1726882466.2484078-25216-277325925625307="` echo /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307 `" ) && sleep 0' 25052 1726882466.25299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882466.25302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882466.25304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.25313: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.25315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.25363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882466.25370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882466.25436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882466.28275: stdout chunk (state=3): >>>ansible-tmp-1726882466.2484078-25216-277325925625307=/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307 <<< 25052 1726882466.28442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882466.28471: stderr chunk (state=3): >>><<< 25052 1726882466.28474: stdout chunk (state=3): >>><<< 25052 1726882466.28495: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882466.2484078-25216-277325925625307=/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882466.28520: variable 'ansible_module_compression' from source: 
unknown 25052 1726882466.28565: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 25052 1726882466.28617: variable 'ansible_facts' from source: unknown 25052 1726882466.28753: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py 25052 1726882466.28857: Sending initial data 25052 1726882466.28861: Sent initial data (154 bytes) 25052 1726882466.29333: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.29336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.29339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.29341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.29396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882466.29400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882466.29405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882466.29471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882466.31661: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882466.31726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882466.31795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp2j5xhuya /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py <<< 25052 1726882466.31799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py" <<< 25052 1726882466.31857: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp2j5xhuya" to remote "/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py" <<< 25052 1726882466.31860: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py" <<< 25052 1726882466.33018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882466.33055: stderr chunk (state=3): >>><<< 25052 1726882466.33058: stdout chunk (state=3): >>><<< 25052 1726882466.33076: done transferring module to remote 25052 1726882466.33085: _low_level_execute_command(): starting 25052 1726882466.33090: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/ /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py && sleep 0' 25052 1726882466.33544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882466.33547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882466.33551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.33553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882466.33555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.33612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882466.33618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882466.33621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882466.33686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882466.36121: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882466.36152: stderr chunk (state=3): >>><<< 25052 1726882466.36155: stdout chunk (state=3): >>><<< 25052 1726882466.36165: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882466.36168: _low_level_execute_command(): starting 25052 1726882466.36173: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/AnsiballZ_setup.py && sleep 0' 25052 1726882466.36623: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.36627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882466.36629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.36631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882466.36633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882466.36685: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882466.36698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882466.36702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882466.36767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.16735: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.64501953125, "5m": 0.48876953125, "15m": 0.2568359375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "d<<< 25052 1726882467.16746: stdout chunk (state=3): >>>ay": "20", "hour": "21", 
"minute": "34", "second": "26", "epoch": "1726882466", "epoch_int": "1726882466", "date": "2024-09-20", "time": "21:34:26", "iso8601_micro": "2024-09-21T01:34:26.787487Z", "iso8601": "2024-09-21T01:34:26Z", "iso8601_basic": "20240920T213426787487", "iso8601_basic_short": "20240920T213426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": 
"off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checks<<< 25052 1726882467.16761: stdout chunk (state=3): >>>umming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fix<<< 25052 1726882467.16775: stdout chunk (state=3): >>>ed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "<<< 25052 1726882467.16934: stdout chunk (state=3): >>>ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794213888, "block_size": 4096, "block_total": 65519099, "block_available": 63914603, "block_used": 1604496, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 25052 1726882467.19833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882467.19849: stderr chunk (state=3): >>><<< 25052 1726882467.19852: stdout chunk (state=3): >>><<< 25052 1726882467.19883: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", 
"ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.64501953125, "5m": 0.48876953125, "15m": 0.2568359375}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "34", "second": "26", "epoch": "1726882466", "epoch_int": "1726882466", "date": "2024-09-20", "time": "21:34:26", "iso8601_micro": "2024-09-21T01:34:26.787487Z", "iso8601": "2024-09-21T01:34:26Z", "iso8601_basic": "20240920T213426787487", "iso8601_basic_short": "20240920T213426", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3289, "used": 242}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 657, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794213888, "block_size": 4096, "block_total": 65519099, "block_available": 63914603, "block_used": 1604496, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": 
true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882467.20085: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882467.20105: _low_level_execute_command(): starting 25052 1726882467.20109: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882466.2484078-25216-277325925625307/ > /dev/null 2>&1 && sleep 0' 25052 1726882467.20565: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882467.20570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882467.20573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.20575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.20577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 
1726882467.20633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.20636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882467.20641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.20711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.23215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.23239: stderr chunk (state=3): >>><<< 25052 1726882467.23243: stdout chunk (state=3): >>><<< 25052 1726882467.23254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882467.23262: handler run complete 25052 1726882467.23341: variable 'ansible_facts' from source: unknown 25052 1726882467.23407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.23583: variable 'ansible_facts' from source: unknown 25052 1726882467.23651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.23730: attempt loop complete, returning result 25052 1726882467.23734: _execute() done 25052 1726882467.23736: dumping result to json 25052 1726882467.23754: done dumping result, returning 25052 1726882467.23761: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-f7f6-4a6d-000000000115] 25052 1726882467.23766: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000115 25052 1726882467.24049: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000115 25052 1726882467.24051: WORKER PROCESS EXITING ok: [managed_node2] 25052 1726882467.24248: no more pending results, returning what we have 25052 1726882467.24250: results queue empty 25052 1726882467.24250: checking for any_errors_fatal 25052 1726882467.24251: done checking for any_errors_fatal 25052 1726882467.24252: checking for max_fail_percentage 25052 1726882467.24253: done checking for max_fail_percentage 25052 1726882467.24253: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.24254: done checking to see if all hosts have failed 25052 1726882467.24254: getting the remaining hosts for this loop 25052 1726882467.24255: done getting the remaining 
hosts for this loop 25052 1726882467.24257: getting the next task for host managed_node2 25052 1726882467.24261: done getting next task for host managed_node2 25052 1726882467.24262: ^ task is: TASK: meta (flush_handlers) 25052 1726882467.24263: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882467.24265: getting variables 25052 1726882467.24266: in VariableManager get_vars() 25052 1726882467.24289: Calling all_inventory to load vars for managed_node2 25052 1726882467.24291: Calling groups_inventory to load vars for managed_node2 25052 1726882467.24292: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.24304: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.24305: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.24307: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.24408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.24527: done with get_vars() 25052 1726882467.24535: done getting variables 25052 1726882467.24581: in VariableManager get_vars() 25052 1726882467.24589: Calling all_inventory to load vars for managed_node2 25052 1726882467.24591: Calling groups_inventory to load vars for managed_node2 25052 1726882467.24595: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.24599: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.24600: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.24602: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.24680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.24788: done with get_vars() 25052 1726882467.24799: done queuing things up, now waiting for results queue to drain 25052 1726882467.24801: results queue empty 25052 1726882467.24801: checking for any_errors_fatal 25052 1726882467.24803: done checking for any_errors_fatal 25052 1726882467.24803: checking for max_fail_percentage 25052 1726882467.24808: done checking for max_fail_percentage 25052 1726882467.24808: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.24809: done checking to see if all hosts have failed 25052 1726882467.24809: getting the remaining hosts for this loop 25052 1726882467.24810: done getting the remaining hosts for this loop 25052 1726882467.24811: getting the next task for host managed_node2 25052 1726882467.24814: done getting next task for host managed_node2 25052 1726882467.24815: ^ task is: TASK: Include the task 'show_interfaces.yml' 25052 1726882467.24816: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882467.24818: getting variables 25052 1726882467.24819: in VariableManager get_vars() 25052 1726882467.24827: Calling all_inventory to load vars for managed_node2 25052 1726882467.24829: Calling groups_inventory to load vars for managed_node2 25052 1726882467.24830: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.24833: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.24834: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.24836: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.24933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.25039: done with get_vars() 25052 1726882467.25045: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:9 Friday 20 September 2024 21:34:27 -0400 (0:00:01.047) 0:00:04.205 ****** 25052 1726882467.25098: entering _queue_task() for managed_node2/include_tasks 25052 1726882467.25314: worker is 1 (out of 1 available) 25052 1726882467.25327: exiting _queue_task() for managed_node2/include_tasks 25052 1726882467.25338: done queuing things up, now waiting for results queue to drain 25052 1726882467.25339: waiting for pending results... 25052 1726882467.25484: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25052 1726882467.25543: in run() - task 12673a56-9f93-f7f6-4a6d-00000000000b 25052 1726882467.25554: variable 'ansible_search_path' from source: unknown 25052 1726882467.25584: calling self._execute() 25052 1726882467.25649: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.25652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.25662: variable 'omit' from source: magic vars 25052 1726882467.25929: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.25938: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.25945: _execute() done 25052 1726882467.25948: dumping result to json 25052 1726882467.25951: done dumping result, returning 25052 1726882467.25957: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-f7f6-4a6d-00000000000b] 25052 1726882467.25962: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000b 25052 1726882467.26049: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000b 25052 1726882467.26051: WORKER PROCESS EXITING 25052 1726882467.26076: no more pending results, returning what we have 25052 1726882467.26080: in VariableManager get_vars() 25052 1726882467.26124: Calling all_inventory to load vars for managed_node2 25052 1726882467.26127: Calling groups_inventory to load vars for managed_node2 25052 1726882467.26129: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.26140: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.26142: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.26145: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.26278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.26390: done with get_vars() 25052 1726882467.26399: variable 
'ansible_search_path' from source: unknown 25052 1726882467.26409: we have included files to process 25052 1726882467.26409: generating all_blocks data 25052 1726882467.26410: done generating all_blocks data 25052 1726882467.26411: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882467.26411: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882467.26413: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882467.26524: in VariableManager get_vars() 25052 1726882467.26538: done with get_vars() 25052 1726882467.26612: done processing included file 25052 1726882467.26614: iterating over new_blocks loaded from include file 25052 1726882467.26615: in VariableManager get_vars() 25052 1726882467.26625: done with get_vars() 25052 1726882467.26626: filtering new block on tags 25052 1726882467.26635: done filtering new block on tags 25052 1726882467.26637: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25052 1726882467.26640: extending task lists for all hosts with included blocks 25052 1726882467.26681: done extending task lists 25052 1726882467.26682: done processing included files 25052 1726882467.26682: results queue empty 25052 1726882467.26683: checking for any_errors_fatal 25052 1726882467.26683: done checking for any_errors_fatal 25052 1726882467.26684: checking for max_fail_percentage 25052 1726882467.26684: done checking for max_fail_percentage 25052 1726882467.26685: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.26685: done checking to see if all hosts have failed 25052 1726882467.26686: getting the remaining hosts for this loop 25052 1726882467.26687: done getting the remaining hosts for this loop 25052 1726882467.26688: getting the next task for host managed_node2 25052 1726882467.26690: done getting next task for host managed_node2 25052 1726882467.26695: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25052 1726882467.26697: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882467.26699: getting variables 25052 1726882467.26699: in VariableManager get_vars() 25052 1726882467.26708: Calling all_inventory to load vars for managed_node2 25052 1726882467.26710: Calling groups_inventory to load vars for managed_node2 25052 1726882467.26711: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.26714: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.26715: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.26717: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.26820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.26927: done with get_vars() 25052 1726882467.26933: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:34:27 -0400 (0:00:00.018) 0:00:04.224 ****** 25052 1726882467.26976: entering _queue_task() for managed_node2/include_tasks 25052 1726882467.27163: worker is 1 (out of 1 available) 25052 1726882467.27175: exiting _queue_task() for managed_node2/include_tasks 25052 1726882467.27185: done queuing things up, now waiting for results queue to drain 25052 1726882467.27187: waiting for pending results... 25052 1726882467.27342: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25052 1726882467.27388: in run() - task 12673a56-9f93-f7f6-4a6d-00000000012b 25052 1726882467.27401: variable 'ansible_search_path' from source: unknown 25052 1726882467.27405: variable 'ansible_search_path' from source: unknown 25052 1726882467.27436: calling self._execute() 25052 1726882467.27495: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.27499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.27506: variable 'omit' from source: magic vars 25052 1726882467.27761: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.27768: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.27779: _execute() done 25052 1726882467.27783: dumping result to json 25052 1726882467.27786: done dumping result, returning 25052 1726882467.27791: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-f7f6-4a6d-00000000012b] 25052 1726882467.27800: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000012b 25052 1726882467.27874: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000012b 25052 1726882467.27877: WORKER PROCESS EXITING 25052 1726882467.27902: no more pending results, returning what we have 25052 1726882467.27907: in VariableManager get_vars() 25052 1726882467.27947: Calling all_inventory to load vars for managed_node2 25052 1726882467.27949: Calling groups_inventory to load vars for managed_node2 25052 1726882467.27951: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.27960: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.27963: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.27965: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.28082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 25052 1726882467.28191: done with get_vars() 25052 1726882467.28199: variable 'ansible_search_path' from source: unknown 25052 1726882467.28200: variable 'ansible_search_path' from source: unknown 25052 1726882467.28226: we have included files to process 25052 1726882467.28227: generating all_blocks data 25052 1726882467.28228: done generating all_blocks data 25052 1726882467.28229: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882467.28229: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882467.28230: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882467.28444: done processing included file 25052 1726882467.28445: iterating over new_blocks loaded from include file 25052 1726882467.28447: in VariableManager get_vars() 25052 1726882467.28458: done with get_vars() 25052 1726882467.28459: filtering new block on tags 25052 1726882467.28469: done filtering new block on tags 25052 1726882467.28470: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 25052 1726882467.28473: extending task lists for all hosts with included blocks 25052 1726882467.28530: done extending task lists 25052 1726882467.28531: done processing included files 25052 1726882467.28532: results queue empty 25052 1726882467.28532: checking for any_errors_fatal 25052 1726882467.28534: done checking for any_errors_fatal 25052 1726882467.28534: checking for max_fail_percentage 25052 1726882467.28535: done checking for max_fail_percentage 25052 1726882467.28535: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.28536: done checking to see if all hosts have failed 25052 1726882467.28536: getting the remaining hosts for this loop 25052 1726882467.28537: done getting the remaining hosts for this loop 25052 1726882467.28538: getting the next task for host managed_node2 25052 1726882467.28542: done getting next task for host managed_node2 25052 1726882467.28544: ^ task is: TASK: Gather current interface info 25052 1726882467.28547: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882467.28548: getting variables 25052 1726882467.28549: in VariableManager get_vars() 25052 1726882467.28557: Calling all_inventory to load vars for managed_node2 25052 1726882467.28558: Calling groups_inventory to load vars for managed_node2 25052 1726882467.28559: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.28562: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.28563: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.28565: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.28641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.28748: done with get_vars() 25052 1726882467.28754: done getting variables 25052 1726882467.28781: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:34:27 -0400 (0:00:00.018) 0:00:04.242 ****** 25052 1726882467.28802: entering _queue_task() for managed_node2/command 25052 1726882467.28969: worker is 1 (out of 1 available) 25052 1726882467.28984: exiting _queue_task() for managed_node2/command 25052 1726882467.28996: done queuing things up, now waiting for results queue to drain 25052 1726882467.28997: waiting for pending results... 
25052 1726882467.29134: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25052 1726882467.29186: in run() - task 12673a56-9f93-f7f6-4a6d-00000000013a 25052 1726882467.29201: variable 'ansible_search_path' from source: unknown 25052 1726882467.29205: variable 'ansible_search_path' from source: unknown 25052 1726882467.29233: calling self._execute() 25052 1726882467.29285: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.29288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.29302: variable 'omit' from source: magic vars 25052 1726882467.29541: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.29558: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.29561: variable 'omit' from source: magic vars 25052 1726882467.29586: variable 'omit' from source: magic vars 25052 1726882467.29613: variable 'omit' from source: magic vars 25052 1726882467.29643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882467.29671: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882467.29685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882467.29701: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.29712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.29733: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882467.29736: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.29740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.29814: Set connection var ansible_pipelining to False 25052 1726882467.29817: Set connection var ansible_connection to ssh 25052 1726882467.29820: Set connection var ansible_shell_type to sh 25052 1726882467.29825: Set connection var ansible_timeout to 10 25052 1726882467.29831: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882467.29836: Set connection var ansible_shell_executable to /bin/sh 25052 1726882467.29851: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.29854: variable 'ansible_connection' from source: unknown 25052 1726882467.29856: variable 'ansible_module_compression' from source: unknown 25052 1726882467.29859: variable 'ansible_shell_type' from source: unknown 25052 1726882467.29861: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.29863: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.29867: variable 'ansible_pipelining' from source: unknown 25052 1726882467.29871: variable 'ansible_timeout' from source: unknown 25052 1726882467.29873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.29970: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882467.29977: variable 'omit' from source: magic vars 25052 
1726882467.29982: starting attempt loop 25052 1726882467.29987: running the handler 25052 1726882467.30005: _low_level_execute_command(): starting 25052 1726882467.30011: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882467.30520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.30524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.30526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.30530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.30570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.30585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.30669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.32931: stdout chunk (state=3): >>>/root <<< 25052 1726882467.33137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.33140: stdout chunk (state=3): >>><<< 25052 1726882467.33143: stderr chunk (state=3): >>><<< 25052 1726882467.33200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882467.33205: _low_level_execute_command(): starting 25052 1726882467.33208: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835 
`" && echo ansible-tmp-1726882467.3317156-25261-25151486020835="` echo /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835 `" ) && sleep 0' 25052 1726882467.33899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882467.33919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.33964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.34009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.34016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.34082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.36747: stdout chunk (state=3): >>>ansible-tmp-1726882467.3317156-25261-25151486020835=/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835 <<< 25052 1726882467.36942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.36954: stderr chunk (state=3): >>><<< 25052 1726882467.36979: stdout chunk (state=3): >>><<< 25052 1726882467.37005: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882467.3317156-25261-25151486020835=/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882467.37199: variable 'ansible_module_compression' from source: unknown 25052 1726882467.37202: ANSIBALLZ: Using generic lock for ansible.legacy.command 25052 
1726882467.37204: ANSIBALLZ: Acquiring lock 25052 1726882467.37206: ANSIBALLZ: Lock acquired: 140207139645744 25052 1726882467.37208: ANSIBALLZ: Creating module 25052 1726882467.50628: ANSIBALLZ: Writing module into payload 25052 1726882467.50799: ANSIBALLZ: Writing module 25052 1726882467.50804: ANSIBALLZ: Renaming module 25052 1726882467.50806: ANSIBALLZ: Done creating module 25052 1726882467.50810: variable 'ansible_facts' from source: unknown 25052 1726882467.50897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py 25052 1726882467.51096: Sending initial data 25052 1726882467.51280: Sent initial data (155 bytes) 25052 1726882467.51912: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.51961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.51980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882467.52008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.52232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.54504: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 25052 1726882467.54524: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882467.54718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882467.54722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpzynqoova /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py <<< 25052 1726882467.54727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py" <<< 25052 1726882467.54861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpzynqoova" to remote "/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py" <<< 25052 1726882467.55705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.55717: stderr chunk (state=3): >>><<< 25052 1726882467.55864: stdout chunk (state=3): >>><<< 25052 1726882467.55867: done transferring module to remote 25052 1726882467.55869: _low_level_execute_command(): starting 25052 1726882467.55872: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/ /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py && sleep 0' 25052 1726882467.56386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882467.56403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882467.56417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.56436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882467.56451: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882467.56466: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882467.56510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.56595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.56598: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882467.56665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.56873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.59227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.59321: stderr chunk (state=3): >>><<< 25052 1726882467.59339: stdout chunk (state=3): >>><<< 25052 1726882467.59357: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882467.59383: _low_level_execute_command(): starting 25052 1726882467.59396: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/AnsiballZ_command.py && sleep 0' 25052 1726882467.59999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882467.60013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882467.60028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.60049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882467.60063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882467.60159: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.60169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882467.60183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.60336: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.81730: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:27.813313", "end": "2024-09-20 21:34:27.816274", "delta": "0:00:00.002961", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882467.83534: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882467.83538: stdout chunk (state=3): >>><<< 25052 1726882467.83540: stderr chunk (state=3): >>><<< 25052 1726882467.83543: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:27.813313", "end": "2024-09-20 21:34:27.816274", "delta": "0:00:00.002961", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882467.83545: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882467.83701: _low_level_execute_command(): starting 25052 1726882467.83705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882467.3317156-25261-25151486020835/ > /dev/null 2>&1 && sleep 0' 25052 1726882467.84768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882467.84777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882467.84787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882467.84835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882467.84899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882467.84922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882467.85110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882467.87711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882467.87714: stdout chunk (state=3): >>><<< 25052 1726882467.87717: stderr chunk (state=3): >>><<< 25052 1726882467.87719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882467.87722: handler run complete 25052 1726882467.87724: Evaluated conditional (False): False 25052 1726882467.87730: attempt loop complete, returning result 25052 1726882467.87733: _execute() done 25052 1726882467.87735: dumping result to json 25052 1726882467.87741: done dumping result, returning 25052 1726882467.87749: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-f7f6-4a6d-00000000013a] 25052 1726882467.87819: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000013a 25052 1726882467.87889: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000013a 25052 1726882467.87892: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002961", "end": "2024-09-20 21:34:27.816274", "rc": 0, "start": "2024-09-20 21:34:27.813313" } STDOUT: bonding_masters eth0 lo 25052 1726882467.87988: no more pending results, returning what we have 25052 1726882467.87991: results queue empty 25052 1726882467.87992: checking for any_errors_fatal 25052 1726882467.87995: done checking for any_errors_fatal 25052 1726882467.87996: checking for max_fail_percentage 25052 1726882467.87998: done checking for max_fail_percentage 25052 1726882467.87999: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.87999: done checking to see if all hosts have failed 25052 1726882467.88000: getting the remaining hosts for this loop 25052 1726882467.88002: done getting the remaining hosts for this loop 25052 1726882467.88005: getting the next task for host managed_node2 25052 1726882467.88176: done getting next task for host managed_node2 25052 1726882467.88179: ^ task is: TASK: Set current_interfaces 25052 1726882467.88183: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882467.88187: getting variables 25052 1726882467.88189: in VariableManager get_vars() 25052 1726882467.88231: Calling all_inventory to load vars for managed_node2 25052 1726882467.88234: Calling groups_inventory to load vars for managed_node2 25052 1726882467.88237: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.88247: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.88250: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.88253: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.88756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.89307: done with get_vars() 25052 1726882467.89318: done getting variables 25052 1726882467.89373: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:34:27 -0400 (0:00:00.606) 0:00:04.849 ****** 25052 1726882467.89448: entering _queue_task() for managed_node2/set_fact 25052 1726882467.89779: worker is 1 (out of 1 available) 25052 1726882467.89791: exiting _queue_task() for managed_node2/set_fact 25052 1726882467.89806: done queuing things up, now waiting for results queue to drain 25052 1726882467.89808: waiting for pending results... 
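The set_fact action runs entirely on the controller (no SSH round-trip appears in the trace that follows), turning the registered command output into the current_interfaces fact shown in the result. A minimal sketch of the task at get_current_interfaces.yml:9, assuming it reads stdout_lines from the registered _current_interfaces variable; only the fact name, the source variable, and the resulting value are confirmed by the log.

    # Sketch of the task at get_current_interfaces.yml:9; stdout_lines is an assumption
    # that matches the resulting fact ['bonding_masters', 'eth0', 'lo'].
    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"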
25052 1726882467.90123: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25052 1726882467.90199: in run() - task 12673a56-9f93-f7f6-4a6d-00000000013b 25052 1726882467.90202: variable 'ansible_search_path' from source: unknown 25052 1726882467.90205: variable 'ansible_search_path' from source: unknown 25052 1726882467.90261: calling self._execute() 25052 1726882467.90333: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.90343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.90356: variable 'omit' from source: magic vars 25052 1726882467.90762: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.90766: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.90768: variable 'omit' from source: magic vars 25052 1726882467.90808: variable 'omit' from source: magic vars 25052 1726882467.90935: variable '_current_interfaces' from source: set_fact 25052 1726882467.91049: variable 'omit' from source: magic vars 25052 1726882467.91125: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882467.91179: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882467.91237: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882467.91243: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.91258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.91296: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882467.91343: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.91348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.91469: Set connection var ansible_pipelining to False 25052 1726882467.91482: Set connection var ansible_connection to ssh 25052 1726882467.91639: Set connection var ansible_shell_type to sh 25052 1726882467.91642: Set connection var ansible_timeout to 10 25052 1726882467.91644: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882467.91646: Set connection var ansible_shell_executable to /bin/sh 25052 1726882467.91648: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.91650: variable 'ansible_connection' from source: unknown 25052 1726882467.91652: variable 'ansible_module_compression' from source: unknown 25052 1726882467.91679: variable 'ansible_shell_type' from source: unknown 25052 1726882467.91783: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.91786: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.91788: variable 'ansible_pipelining' from source: unknown 25052 1726882467.91790: variable 'ansible_timeout' from source: unknown 25052 1726882467.91792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.92085: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25052 1726882467.92300: variable 'omit' from source: magic vars 25052 1726882467.92303: starting attempt loop 25052 1726882467.92305: running the handler 25052 1726882467.92307: handler run complete 25052 1726882467.92309: attempt loop complete, returning result 25052 1726882467.92311: _execute() done 25052 1726882467.92313: dumping result to json 25052 1726882467.92315: done dumping result, returning 25052 1726882467.92318: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-f7f6-4a6d-00000000013b] 25052 1726882467.92320: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000013b 25052 1726882467.92382: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000013b 25052 1726882467.92385: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25052 1726882467.92447: no more pending results, returning what we have 25052 1726882467.92450: results queue empty 25052 1726882467.92451: checking for any_errors_fatal 25052 1726882467.92459: done checking for any_errors_fatal 25052 1726882467.92460: checking for max_fail_percentage 25052 1726882467.92462: done checking for max_fail_percentage 25052 1726882467.92462: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.92463: done checking to see if all hosts have failed 25052 1726882467.92464: getting the remaining hosts for this loop 25052 1726882467.92465: done getting the remaining hosts for this loop 25052 1726882467.92469: getting the next task for host managed_node2 25052 1726882467.92477: done getting next task for host managed_node2 25052 1726882467.92479: ^ task is: TASK: Show current_interfaces 25052 1726882467.92483: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882467.92487: getting variables 25052 1726882467.92488: in VariableManager get_vars() 25052 1726882467.92638: Calling all_inventory to load vars for managed_node2 25052 1726882467.92641: Calling groups_inventory to load vars for managed_node2 25052 1726882467.92644: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.92655: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.92657: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.92660: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.93248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.93573: done with get_vars() 25052 1726882467.93585: done getting variables 25052 1726882467.93687: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:34:27 -0400 (0:00:00.042) 0:00:04.892 ****** 25052 1726882467.93721: entering _queue_task() for managed_node2/debug 25052 1726882467.93723: Creating lock for debug 25052 1726882467.94043: worker is 1 (out of 1 available) 25052 1726882467.94055: exiting _queue_task() for managed_node2/debug 25052 1726882467.94068: done queuing things up, now waiting for results queue to drain 25052 1726882467.94069: waiting for pending results... 
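A plausible sketch of the debug task at show_interfaces.yml:5; the msg template is inferred from the rendered MSG line in the result below and may not match the file verbatim.

    # Sketch of the task at show_interfaces.yml:5 (msg wording inferred from the output).
    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"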
25052 1726882467.94423: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25052 1726882467.94499: in run() - task 12673a56-9f93-f7f6-4a6d-00000000012c 25052 1726882467.94504: variable 'ansible_search_path' from source: unknown 25052 1726882467.94507: variable 'ansible_search_path' from source: unknown 25052 1726882467.94509: calling self._execute() 25052 1726882467.94598: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.94611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.94630: variable 'omit' from source: magic vars 25052 1726882467.95195: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.95219: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.95289: variable 'omit' from source: magic vars 25052 1726882467.95309: variable 'omit' from source: magic vars 25052 1726882467.95423: variable 'current_interfaces' from source: set_fact 25052 1726882467.95457: variable 'omit' from source: magic vars 25052 1726882467.95513: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882467.95599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882467.95604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882467.95606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.95616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882467.95654: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882467.95663: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.95671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.95778: Set connection var ansible_pipelining to False 25052 1726882467.95786: Set connection var ansible_connection to ssh 25052 1726882467.95795: Set connection var ansible_shell_type to sh 25052 1726882467.95808: Set connection var ansible_timeout to 10 25052 1726882467.95836: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882467.95840: Set connection var ansible_shell_executable to /bin/sh 25052 1726882467.95866: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.95946: variable 'ansible_connection' from source: unknown 25052 1726882467.95949: variable 'ansible_module_compression' from source: unknown 25052 1726882467.95952: variable 'ansible_shell_type' from source: unknown 25052 1726882467.95954: variable 'ansible_shell_executable' from source: unknown 25052 1726882467.95956: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.95957: variable 'ansible_pipelining' from source: unknown 25052 1726882467.95959: variable 'ansible_timeout' from source: unknown 25052 1726882467.95961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.96073: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
25052 1726882467.96087: variable 'omit' from source: magic vars 25052 1726882467.96103: starting attempt loop 25052 1726882467.96110: running the handler 25052 1726882467.96578: handler run complete 25052 1726882467.96582: attempt loop complete, returning result 25052 1726882467.96585: _execute() done 25052 1726882467.96586: dumping result to json 25052 1726882467.96588: done dumping result, returning 25052 1726882467.96591: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-f7f6-4a6d-00000000012c] 25052 1726882467.96596: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000012c 25052 1726882467.96677: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000012c 25052 1726882467.96681: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25052 1726882467.96737: no more pending results, returning what we have 25052 1726882467.96741: results queue empty 25052 1726882467.96741: checking for any_errors_fatal 25052 1726882467.96746: done checking for any_errors_fatal 25052 1726882467.96747: checking for max_fail_percentage 25052 1726882467.96749: done checking for max_fail_percentage 25052 1726882467.96749: checking to see if all hosts have failed and the running result is not ok 25052 1726882467.96750: done checking to see if all hosts have failed 25052 1726882467.96751: getting the remaining hosts for this loop 25052 1726882467.96752: done getting the remaining hosts for this loop 25052 1726882467.96756: getting the next task for host managed_node2 25052 1726882467.96764: done getting next task for host managed_node2 25052 1726882467.96768: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25052 1726882467.96770: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882467.96775: getting variables 25052 1726882467.96776: in VariableManager get_vars() 25052 1726882467.96821: Calling all_inventory to load vars for managed_node2 25052 1726882467.96825: Calling groups_inventory to load vars for managed_node2 25052 1726882467.96827: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.96837: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.96839: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.96842: Calling groups_plugins_play to load vars for managed_node2 25052 1726882467.97557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882467.97960: done with get_vars() 25052 1726882467.97970: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:11 Friday 20 September 2024 21:34:27 -0400 (0:00:00.044) 0:00:04.936 ****** 25052 1726882467.98160: entering _queue_task() for managed_node2/include_tasks 25052 1726882467.98659: worker is 1 (out of 1 available) 25052 1726882467.98671: exiting _queue_task() for managed_node2/include_tasks 25052 1726882467.98682: done queuing things up, now waiting for results queue to drain 25052 1726882467.98683: waiting for pending results... 
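The include that runs next is resolved to manage_test_interface.yml inside the collection's tests/network/playbooks/tasks directory. A sketch of what the task at tests_ipv6.yml:11 might look like; the relative path form is an assumption, since only the fully resolved file is visible in the log.

    # Sketch of the include at tests_ipv6.yml:11 (relative path assumed).
    - name: Include the task 'manage_test_interface.yml'
      include_tasks: tasks/manage_test_interface.yml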
25052 1726882467.99110: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 25052 1726882467.99115: in run() - task 12673a56-9f93-f7f6-4a6d-00000000000c 25052 1726882467.99118: variable 'ansible_search_path' from source: unknown 25052 1726882467.99121: calling self._execute() 25052 1726882467.99151: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882467.99161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882467.99173: variable 'omit' from source: magic vars 25052 1726882467.99528: variable 'ansible_distribution_major_version' from source: facts 25052 1726882467.99545: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882467.99557: _execute() done 25052 1726882467.99568: dumping result to json 25052 1726882467.99574: done dumping result, returning 25052 1726882467.99584: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-f7f6-4a6d-00000000000c] 25052 1726882467.99596: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000c 25052 1726882467.99820: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000c 25052 1726882467.99824: WORKER PROCESS EXITING 25052 1726882467.99847: no more pending results, returning what we have 25052 1726882467.99852: in VariableManager get_vars() 25052 1726882467.99898: Calling all_inventory to load vars for managed_node2 25052 1726882467.99901: Calling groups_inventory to load vars for managed_node2 25052 1726882467.99903: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882467.99914: Calling all_plugins_play to load vars for managed_node2 25052 1726882467.99917: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882467.99920: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.00185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.00385: done with get_vars() 25052 1726882468.00395: variable 'ansible_search_path' from source: unknown 25052 1726882468.00409: we have included files to process 25052 1726882468.00411: generating all_blocks data 25052 1726882468.00415: done generating all_blocks data 25052 1726882468.00420: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882468.00421: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882468.00430: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882468.01055: in VariableManager get_vars() 25052 1726882468.01075: done with get_vars() 25052 1726882468.01282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 25052 1726882468.01836: done processing included file 25052 1726882468.01838: iterating over new_blocks loaded from include file 25052 1726882468.01839: in VariableManager get_vars() 25052 1726882468.01855: done with get_vars() 25052 1726882468.01856: filtering new block on tags 25052 1726882468.01900: done filtering new block on tags 25052 1726882468.01903: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 25052 1726882468.01917: extending task lists for all hosts with included blocks 25052 1726882468.02071: done extending task lists 25052 1726882468.02072: done processing included files 25052 1726882468.02073: results queue empty 25052 1726882468.02073: checking for any_errors_fatal 25052 1726882468.02077: done checking for any_errors_fatal 25052 1726882468.02077: checking for max_fail_percentage 25052 1726882468.02078: done checking for max_fail_percentage 25052 1726882468.02079: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.02080: done checking to see if all hosts have failed 25052 1726882468.02081: getting the remaining hosts for this loop 25052 1726882468.02082: done getting the remaining hosts for this loop 25052 1726882468.02084: getting the next task for host managed_node2 25052 1726882468.02088: done getting next task for host managed_node2 25052 1726882468.02090: ^ task is: TASK: Ensure state in ["present", "absent"] 25052 1726882468.02096: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882468.02099: getting variables 25052 1726882468.02100: in VariableManager get_vars() 25052 1726882468.02115: Calling all_inventory to load vars for managed_node2 25052 1726882468.02117: Calling groups_inventory to load vars for managed_node2 25052 1726882468.02119: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.02124: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.02127: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.02130: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.02265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.02451: done with get_vars() 25052 1726882468.02460: done getting variables 25052 1726882468.02524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:34:28 -0400 (0:00:00.043) 0:00:04.980 ****** 25052 1726882468.02550: entering _queue_task() for managed_node2/fail 25052 1726882468.02552: Creating lock for fail 25052 1726882468.02823: worker is 1 (out of 1 available) 25052 1726882468.02836: exiting _queue_task() for managed_node2/fail 25052 1726882468.02848: done queuing things up, now waiting for results queue to drain 25052 1726882468.02849: waiting for pending results... 
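manage_test_interface.yml opens with two guard tasks that fail fast on bad parameters; both are skipped in this run because their conditions evaluate to False. The when expressions are quoted from the false_condition fields in the skip results that follow (the second guard, 'Ensure type ...', runs right after this one); the fail messages are placeholders.

    # Sketch of the guard tasks at manage_test_interface.yml:3 and :8.
    # The when expressions come from the logged false_condition values; msgs are assumed.
    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be 'present' or 'absent'"
      when: state not in ["present", "absent"]

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be 'dummy', 'tap' or 'veth'"
      when: type not in ["dummy", "tap", "veth"]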
25052 1726882468.03107: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 25052 1726882468.03214: in run() - task 12673a56-9f93-f7f6-4a6d-000000000156 25052 1726882468.03300: variable 'ansible_search_path' from source: unknown 25052 1726882468.03304: variable 'ansible_search_path' from source: unknown 25052 1726882468.03308: calling self._execute() 25052 1726882468.03360: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.03371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.03387: variable 'omit' from source: magic vars 25052 1726882468.03741: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.03764: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.03897: variable 'state' from source: include params 25052 1726882468.03969: Evaluated conditional (state not in ["present", "absent"]): False 25052 1726882468.03972: when evaluation is False, skipping this task 25052 1726882468.03974: _execute() done 25052 1726882468.03976: dumping result to json 25052 1726882468.03978: done dumping result, returning 25052 1726882468.03980: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-f7f6-4a6d-000000000156] 25052 1726882468.03982: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000156 25052 1726882468.04048: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000156 25052 1726882468.04050: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25052 1726882468.04123: no more pending results, returning what we have 25052 1726882468.04126: results queue empty 25052 1726882468.04127: checking for any_errors_fatal 25052 1726882468.04129: done checking for any_errors_fatal 25052 1726882468.04130: checking for max_fail_percentage 25052 1726882468.04131: done checking for max_fail_percentage 25052 1726882468.04132: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.04133: done checking to see if all hosts have failed 25052 1726882468.04134: getting the remaining hosts for this loop 25052 1726882468.04135: done getting the remaining hosts for this loop 25052 1726882468.04138: getting the next task for host managed_node2 25052 1726882468.04144: done getting next task for host managed_node2 25052 1726882468.04146: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25052 1726882468.04150: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882468.04154: getting variables 25052 1726882468.04156: in VariableManager get_vars() 25052 1726882468.04197: Calling all_inventory to load vars for managed_node2 25052 1726882468.04200: Calling groups_inventory to load vars for managed_node2 25052 1726882468.04203: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.04215: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.04217: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.04220: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.04591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.04789: done with get_vars() 25052 1726882468.04802: done getting variables 25052 1726882468.04854: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:34:28 -0400 (0:00:00.023) 0:00:05.003 ****** 25052 1726882468.04880: entering _queue_task() for managed_node2/fail 25052 1726882468.05301: worker is 1 (out of 1 available) 25052 1726882468.05309: exiting _queue_task() for managed_node2/fail 25052 1726882468.05317: done queuing things up, now waiting for results queue to drain 25052 1726882468.05319: waiting for pending results... 25052 1726882468.05444: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 25052 1726882468.05458: in run() - task 12673a56-9f93-f7f6-4a6d-000000000157 25052 1726882468.05475: variable 'ansible_search_path' from source: unknown 25052 1726882468.05482: variable 'ansible_search_path' from source: unknown 25052 1726882468.05520: calling self._execute() 25052 1726882468.05604: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.05614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.05628: variable 'omit' from source: magic vars 25052 1726882468.05976: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.05998: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.06196: variable 'type' from source: play vars 25052 1726882468.06204: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25052 1726882468.06206: when evaluation is False, skipping this task 25052 1726882468.06209: _execute() done 25052 1726882468.06211: dumping result to json 25052 1726882468.06213: done dumping result, returning 25052 1726882468.06215: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-f7f6-4a6d-000000000157] 25052 1726882468.06217: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000157 skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25052 1726882468.06339: no more pending results, returning what we have 25052 1726882468.06343: results queue empty 25052 1726882468.06344: checking for any_errors_fatal 25052 
1726882468.06349: done checking for any_errors_fatal 25052 1726882468.06350: checking for max_fail_percentage 25052 1726882468.06352: done checking for max_fail_percentage 25052 1726882468.06352: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.06353: done checking to see if all hosts have failed 25052 1726882468.06354: getting the remaining hosts for this loop 25052 1726882468.06355: done getting the remaining hosts for this loop 25052 1726882468.06358: getting the next task for host managed_node2 25052 1726882468.06364: done getting next task for host managed_node2 25052 1726882468.06367: ^ task is: TASK: Include the task 'show_interfaces.yml' 25052 1726882468.06370: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882468.06374: getting variables 25052 1726882468.06375: in VariableManager get_vars() 25052 1726882468.06417: Calling all_inventory to load vars for managed_node2 25052 1726882468.06421: Calling groups_inventory to load vars for managed_node2 25052 1726882468.06424: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.06436: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.06438: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.06441: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.06787: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000157 25052 1726882468.06790: WORKER PROCESS EXITING 25052 1726882468.06816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.07017: done with get_vars() 25052 1726882468.07026: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:34:28 -0400 (0:00:00.022) 0:00:05.025 ****** 25052 1726882468.07109: entering _queue_task() for managed_node2/include_tasks 25052 1726882468.07331: worker is 1 (out of 1 available) 25052 1726882468.07342: exiting _queue_task() for managed_node2/include_tasks 25052 1726882468.07353: done queuing things up, now waiting for results queue to drain 25052 1726882468.07354: waiting for pending results... 
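The two skipped tasks above are input-validation guards in manage_test_interface.yml: the fail action plugin is loaded for each, and the conditionals state not in ["present", "absent"] and type not in ["dummy", "tap", "veth"] both evaluate to False, so neither task runs. A rough sketch of what those guards presumably look like (the when expressions and the fail module come from the false_condition strings in this log; the msg wording is an illustrative assumption):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state: {{ state }}"   # msg text is an assumption, not taken from the log
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type: {{ type }}"     # msg text is an assumption, not taken from the log
  when: type not in ["dummy", "tap", "veth"]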
25052 1726882468.07568: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25052 1726882468.07683: in run() - task 12673a56-9f93-f7f6-4a6d-000000000158 25052 1726882468.07709: variable 'ansible_search_path' from source: unknown 25052 1726882468.07717: variable 'ansible_search_path' from source: unknown 25052 1726882468.07754: calling self._execute() 25052 1726882468.07847: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.07857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.07871: variable 'omit' from source: magic vars 25052 1726882468.08606: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.08624: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.08636: _execute() done 25052 1726882468.08644: dumping result to json 25052 1726882468.08651: done dumping result, returning 25052 1726882468.08662: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-f7f6-4a6d-000000000158] 25052 1726882468.08675: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000158 25052 1726882468.08803: no more pending results, returning what we have 25052 1726882468.08807: in VariableManager get_vars() 25052 1726882468.08854: Calling all_inventory to load vars for managed_node2 25052 1726882468.08857: Calling groups_inventory to load vars for managed_node2 25052 1726882468.08860: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.08874: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.08877: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.08880: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.09525: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000158 25052 1726882468.09529: WORKER PROCESS EXITING 25052 1726882468.09550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.09731: done with get_vars() 25052 1726882468.09738: variable 'ansible_search_path' from source: unknown 25052 1726882468.09739: variable 'ansible_search_path' from source: unknown 25052 1726882468.09772: we have included files to process 25052 1726882468.09773: generating all_blocks data 25052 1726882468.09774: done generating all_blocks data 25052 1726882468.09777: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882468.09778: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882468.09780: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882468.09877: in VariableManager get_vars() 25052 1726882468.09906: done with get_vars() 25052 1726882468.10015: done processing included file 25052 1726882468.10017: iterating over new_blocks loaded from include file 25052 1726882468.10019: in VariableManager get_vars() 25052 1726882468.10038: done with get_vars() 25052 1726882468.10039: filtering new block on tags 25052 1726882468.10056: done filtering new block on tags 25052 1726882468.10059: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25052 1726882468.10064: extending task lists for all hosts with included blocks 25052 1726882468.10431: done extending task lists 25052 1726882468.10432: done processing included files 25052 1726882468.10433: results queue empty 25052 1726882468.10434: checking for any_errors_fatal 25052 1726882468.10436: done checking for any_errors_fatal 25052 1726882468.10436: checking for max_fail_percentage 25052 1726882468.10437: done checking for max_fail_percentage 25052 1726882468.10438: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.10439: done checking to see if all hosts have failed 25052 1726882468.10439: getting the remaining hosts for this loop 25052 1726882468.10440: done getting the remaining hosts for this loop 25052 1726882468.10442: getting the next task for host managed_node2 25052 1726882468.10446: done getting next task for host managed_node2 25052 1726882468.10448: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25052 1726882468.10451: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882468.10453: getting variables 25052 1726882468.10454: in VariableManager get_vars() 25052 1726882468.10465: Calling all_inventory to load vars for managed_node2 25052 1726882468.10467: Calling groups_inventory to load vars for managed_node2 25052 1726882468.10468: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.10473: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.10475: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.10477: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.10635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.10826: done with get_vars() 25052 1726882468.10834: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:34:28 -0400 (0:00:00.037) 0:00:05.063 ****** 25052 1726882468.10899: entering _queue_task() for managed_node2/include_tasks 25052 1726882468.11174: worker is 1 (out of 1 available) 25052 1726882468.11185: exiting _queue_task() for managed_node2/include_tasks 25052 1726882468.11401: done queuing things up, now waiting for results queue to drain 25052 1726882468.11403: waiting for pending results... 
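At this point the include chain is manage_test_interface.yml:13 -> show_interfaces.yml -> get_current_interfaces.yml: the include_tasks action returns no result of its own, the strategy loads the included file, generates all_blocks data, and extends the host's task list. Based on the task paths in this log (the include at show_interfaces.yml:3 and the debug at show_interfaces.yml:5) and the debug output seen further down, show_interfaces.yml is presumably little more than:

# show_interfaces.yml as inferred from this log (a sketch, not the verified file contents;
# the exact include path form is an assumption)
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"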
25052 1726882468.11450: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25052 1726882468.11570: in run() - task 12673a56-9f93-f7f6-4a6d-00000000017f 25052 1726882468.11591: variable 'ansible_search_path' from source: unknown 25052 1726882468.11604: variable 'ansible_search_path' from source: unknown 25052 1726882468.11699: calling self._execute() 25052 1726882468.11742: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.11753: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.11769: variable 'omit' from source: magic vars 25052 1726882468.12142: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.12159: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.12175: _execute() done 25052 1726882468.12184: dumping result to json 25052 1726882468.12196: done dumping result, returning 25052 1726882468.12281: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-f7f6-4a6d-00000000017f] 25052 1726882468.12284: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000017f 25052 1726882468.12352: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000017f 25052 1726882468.12355: WORKER PROCESS EXITING 25052 1726882468.12413: no more pending results, returning what we have 25052 1726882468.12420: in VariableManager get_vars() 25052 1726882468.12466: Calling all_inventory to load vars for managed_node2 25052 1726882468.12470: Calling groups_inventory to load vars for managed_node2 25052 1726882468.12473: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.12486: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.12489: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.12496: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.12865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.13069: done with get_vars() 25052 1726882468.13076: variable 'ansible_search_path' from source: unknown 25052 1726882468.13077: variable 'ansible_search_path' from source: unknown 25052 1726882468.13136: we have included files to process 25052 1726882468.13137: generating all_blocks data 25052 1726882468.13139: done generating all_blocks data 25052 1726882468.13141: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882468.13142: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882468.13144: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882468.13403: done processing included file 25052 1726882468.13405: iterating over new_blocks loaded from include file 25052 1726882468.13407: in VariableManager get_vars() 25052 1726882468.13426: done with get_vars() 25052 1726882468.13428: filtering new block on tags 25052 1726882468.13446: done filtering new block on tags 25052 1726882468.13448: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 25052 1726882468.13453: extending task lists for all hosts with included blocks 25052 1726882468.13606: done extending task lists 25052 1726882468.13607: done processing included files 25052 1726882468.13608: results queue empty 25052 1726882468.13609: checking for any_errors_fatal 25052 1726882468.13612: done checking for any_errors_fatal 25052 1726882468.13613: checking for max_fail_percentage 25052 1726882468.13614: done checking for max_fail_percentage 25052 1726882468.13614: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.13615: done checking to see if all hosts have failed 25052 1726882468.13616: getting the remaining hosts for this loop 25052 1726882468.13617: done getting the remaining hosts for this loop 25052 1726882468.13620: getting the next task for host managed_node2 25052 1726882468.13624: done getting next task for host managed_node2 25052 1726882468.13626: ^ task is: TASK: Gather current interface info 25052 1726882468.13630: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882468.13632: getting variables 25052 1726882468.13633: in VariableManager get_vars() 25052 1726882468.13645: Calling all_inventory to load vars for managed_node2 25052 1726882468.13647: Calling groups_inventory to load vars for managed_node2 25052 1726882468.13649: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.13653: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.13655: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.13658: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.13823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.14008: done with get_vars() 25052 1726882468.14017: done getting variables 25052 1726882468.14055: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:34:28 -0400 (0:00:00.031) 0:00:05.095 ****** 25052 1726882468.14083: entering _queue_task() for managed_node2/command 25052 1726882468.14350: worker is 1 (out of 1 available) 25052 1726882468.14365: exiting _queue_task() for managed_node2/command 25052 1726882468.14377: done queuing things up, now waiting for results queue to drain 25052 1726882468.14378: waiting for pending results... 
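The 'Gather current interface info' task runs the command module over the persistent SSH connection. The module invocation visible further down shows _raw_params "ls -1" with chdir "/sys/class/net", the registered result later surfaces as the variable _current_interfaces, and the task is reported with changed: false even though the module itself returns changed: true (the "Evaluated conditional (False): False" line after the handler completes points to a changed_when override). A sketch of the task consistent with those details:

# get_current_interfaces.yml:3, reconstructed from the module args and variables in this log (a sketch)
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: _current_interfaces
  changed_when: false   # inferred from the final changed: false result; not shown verbatim in the log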
25052 1726882468.14810: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25052 1726882468.14815: in run() - task 12673a56-9f93-f7f6-4a6d-0000000001b6 25052 1726882468.14817: variable 'ansible_search_path' from source: unknown 25052 1726882468.14820: variable 'ansible_search_path' from source: unknown 25052 1726882468.14823: calling self._execute() 25052 1726882468.14867: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.14876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.14888: variable 'omit' from source: magic vars 25052 1726882468.15246: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.15267: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.15277: variable 'omit' from source: magic vars 25052 1726882468.15332: variable 'omit' from source: magic vars 25052 1726882468.15375: variable 'omit' from source: magic vars 25052 1726882468.15421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882468.15459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882468.15486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882468.15512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.15529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.15561: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882468.15570: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.15578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.15684: Set connection var ansible_pipelining to False 25052 1726882468.15700: Set connection var ansible_connection to ssh 25052 1726882468.15707: Set connection var ansible_shell_type to sh 25052 1726882468.15718: Set connection var ansible_timeout to 10 25052 1726882468.15732: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882468.15741: Set connection var ansible_shell_executable to /bin/sh 25052 1726882468.15764: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.15772: variable 'ansible_connection' from source: unknown 25052 1726882468.15799: variable 'ansible_module_compression' from source: unknown 25052 1726882468.15802: variable 'ansible_shell_type' from source: unknown 25052 1726882468.15804: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.15806: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.15808: variable 'ansible_pipelining' from source: unknown 25052 1726882468.15810: variable 'ansible_timeout' from source: unknown 25052 1726882468.15813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.16016: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882468.16021: variable 'omit' from source: magic vars 25052 
1726882468.16023: starting attempt loop 25052 1726882468.16026: running the handler 25052 1726882468.16028: _low_level_execute_command(): starting 25052 1726882468.16032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882468.16734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882468.16785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.16862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.16896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.16917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.17046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 25052 1726882468.19443: stdout chunk (state=3): >>>/root <<< 25052 1726882468.19648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.19651: stdout chunk (state=3): >>><<< 25052 1726882468.19653: stderr chunk (state=3): >>><<< 25052 1726882468.19769: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 25052 1726882468.19773: _low_level_execute_command(): starting 25052 1726882468.19777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102 `" && echo 
ansible-tmp-1726882468.1967852-25309-164469952745102="` echo /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102 `" ) && sleep 0' 25052 1726882468.20278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882468.20378: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882468.20414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.20436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882468.20611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.20641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.20730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.20785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.22714: stdout chunk (state=3): >>>ansible-tmp-1726882468.1967852-25309-164469952745102=/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102 <<< 25052 1726882468.22983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.22986: stdout chunk (state=3): >>><<< 25052 1726882468.22989: stderr chunk (state=3): >>><<< 25052 1726882468.23010: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882468.1967852-25309-164469952745102=/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.23401: variable 'ansible_module_compression' from source: 
unknown 25052 1726882468.23405: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882468.23407: variable 'ansible_facts' from source: unknown 25052 1726882468.23697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py 25052 1726882468.23820: Sending initial data 25052 1726882468.23824: Sent initial data (156 bytes) 25052 1726882468.24946: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882468.25054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.25067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.25082: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.25174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.27324: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882468.27410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882468.27503: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp11p0iul9 /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py <<< 25052 1726882468.27517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py" <<< 25052 1726882468.27575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp11p0iul9" to remote "/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py" <<< 25052 1726882468.28604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.28664: stderr chunk (state=3): >>><<< 25052 1726882468.28674: stdout chunk (state=3): >>><<< 25052 1726882468.28709: done transferring module to remote 25052 1726882468.28740: _low_level_execute_command(): starting 25052 1726882468.28751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/ /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py && sleep 0' 25052 1726882468.29603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882468.29620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882468.29647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.29710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.29765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.29787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.29806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.29921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.32238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.32277: stderr chunk (state=3): >>><<< 25052 1726882468.32287: stdout chunk (state=3): >>><<< 25052 1726882468.32317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.32325: _low_level_execute_command(): starting 25052 1726882468.32334: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/AnsiballZ_command.py && sleep 0' 25052 1726882468.33881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.33885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.33952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.33980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.34000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.34184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.49398: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:28.489451", "end": "2024-09-20 21:34:28.492576", "delta": "0:00:00.003125", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882468.50760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.50835: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882468.50847: stdout chunk (state=3): >>><<< 25052 1726882468.50858: stderr chunk (state=3): >>><<< 25052 1726882468.50879: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:28.489451", "end": "2024-09-20 21:34:28.492576", "delta": "0:00:00.003125", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882468.50926: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882468.50944: _low_level_execute_command(): starting 25052 1726882468.50958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882468.1967852-25309-164469952745102/ > /dev/null 2>&1 && sleep 0' 25052 1726882468.52028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.52032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.52039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.52168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.54002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.54048: stderr chunk (state=3): >>><<< 25052 1726882468.54059: stdout chunk (state=3): >>><<< 25052 1726882468.54080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.54094: handler run complete 25052 1726882468.54122: Evaluated conditional (False): False 25052 1726882468.54143: attempt loop complete, returning result 25052 1726882468.54151: _execute() done 25052 1726882468.54157: dumping result to json 25052 1726882468.54166: done dumping result, returning 25052 1726882468.54180: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-f7f6-4a6d-0000000001b6] 25052 1726882468.54190: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001b6 25052 1726882468.54613: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001b6 25052 1726882468.54616: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003125", "end": "2024-09-20 21:34:28.492576", "rc": 0, "start": "2024-09-20 21:34:28.489451" } STDOUT: bonding_masters eth0 lo 25052 1726882468.54688: no more pending results, returning what we have 25052 1726882468.54692: results queue empty 25052 1726882468.54695: checking for any_errors_fatal 25052 1726882468.54696: done checking for any_errors_fatal 25052 1726882468.54697: checking for max_fail_percentage 25052 1726882468.54699: done checking for max_fail_percentage 25052 1726882468.54700: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.54700: done checking to see if all hosts have failed 25052 1726882468.54701: getting the remaining hosts for this loop 25052 1726882468.54702: done getting the remaining hosts for this loop 25052 1726882468.54706: getting the next task for host managed_node2 25052 1726882468.54712: done getting next task for host managed_node2 25052 1726882468.54715: ^ task is: TASK: Set current_interfaces 25052 1726882468.54720: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882468.54724: getting variables 25052 1726882468.54725: in VariableManager get_vars() 25052 1726882468.54764: Calling all_inventory to load vars for managed_node2 25052 1726882468.54767: Calling groups_inventory to load vars for managed_node2 25052 1726882468.54769: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.54779: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.54782: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.54785: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.55333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.55984: done with get_vars() 25052 1726882468.55998: done getting variables 25052 1726882468.56061: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:34:28 -0400 (0:00:00.420) 0:00:05.515 ****** 25052 1726882468.56097: entering _queue_task() for managed_node2/set_fact 25052 1726882468.56864: worker is 1 (out of 1 available) 25052 1726882468.56877: exiting _queue_task() for managed_node2/set_fact 25052 1726882468.56887: done queuing things up, now waiting for results queue to drain 25052 1726882468.56888: waiting for pending results... 
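The set_fact task that follows turns the registered command output into the current_interfaces fact; the resulting ansible_facts shown below (["bonding_masters", "eth0", "lo"]) are exactly the stdout lines of the ls above. A minimal sketch, assuming the fact is taken from stdout_lines (the exact Jinja expression is not visible in the log):

# get_current_interfaces.yml:9 (sketch)
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

With that fact in place, the 'Show current_interfaces' debug task later prints current_interfaces: ['bonding_masters', 'eth0', 'lo'].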
25052 1726882468.57409: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25052 1726882468.57414: in run() - task 12673a56-9f93-f7f6-4a6d-0000000001b7 25052 1726882468.57416: variable 'ansible_search_path' from source: unknown 25052 1726882468.57419: variable 'ansible_search_path' from source: unknown 25052 1726882468.57422: calling self._execute() 25052 1726882468.57478: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.57488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.57508: variable 'omit' from source: magic vars 25052 1726882468.58402: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.58422: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.58432: variable 'omit' from source: magic vars 25052 1726882468.58485: variable 'omit' from source: magic vars 25052 1726882468.58999: variable '_current_interfaces' from source: set_fact 25052 1726882468.59002: variable 'omit' from source: magic vars 25052 1726882468.59005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882468.59008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882468.59009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882468.59011: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.59013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.59208: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882468.59216: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.59225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.59332: Set connection var ansible_pipelining to False 25052 1726882468.59505: Set connection var ansible_connection to ssh 25052 1726882468.59512: Set connection var ansible_shell_type to sh 25052 1726882468.59524: Set connection var ansible_timeout to 10 25052 1726882468.59534: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882468.59541: Set connection var ansible_shell_executable to /bin/sh 25052 1726882468.59562: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.59569: variable 'ansible_connection' from source: unknown 25052 1726882468.59575: variable 'ansible_module_compression' from source: unknown 25052 1726882468.59582: variable 'ansible_shell_type' from source: unknown 25052 1726882468.59587: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.59598: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.59605: variable 'ansible_pipelining' from source: unknown 25052 1726882468.59614: variable 'ansible_timeout' from source: unknown 25052 1726882468.59624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.59754: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25052 1726882468.60200: variable 'omit' from source: magic vars 25052 1726882468.60204: starting attempt loop 25052 1726882468.60206: running the handler 25052 1726882468.60208: handler run complete 25052 1726882468.60211: attempt loop complete, returning result 25052 1726882468.60213: _execute() done 25052 1726882468.60215: dumping result to json 25052 1726882468.60217: done dumping result, returning 25052 1726882468.60219: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-f7f6-4a6d-0000000001b7] 25052 1726882468.60221: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001b7 25052 1726882468.60277: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001b7 25052 1726882468.60281: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 25052 1726882468.60453: no more pending results, returning what we have 25052 1726882468.60455: results queue empty 25052 1726882468.60456: checking for any_errors_fatal 25052 1726882468.60463: done checking for any_errors_fatal 25052 1726882468.60464: checking for max_fail_percentage 25052 1726882468.60465: done checking for max_fail_percentage 25052 1726882468.60466: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.60466: done checking to see if all hosts have failed 25052 1726882468.60467: getting the remaining hosts for this loop 25052 1726882468.60468: done getting the remaining hosts for this loop 25052 1726882468.60471: getting the next task for host managed_node2 25052 1726882468.60478: done getting next task for host managed_node2 25052 1726882468.60480: ^ task is: TASK: Show current_interfaces 25052 1726882468.60483: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882468.60486: getting variables 25052 1726882468.60487: in VariableManager get_vars() 25052 1726882468.60619: Calling all_inventory to load vars for managed_node2 25052 1726882468.60622: Calling groups_inventory to load vars for managed_node2 25052 1726882468.60624: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.60632: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.60634: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.60637: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.61045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.61547: done with get_vars() 25052 1726882468.61556: done getting variables 25052 1726882468.61654: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:34:28 -0400 (0:00:00.055) 0:00:05.571 ****** 25052 1726882468.61682: entering _queue_task() for managed_node2/debug 25052 1726882468.62443: worker is 1 (out of 1 available) 25052 1726882468.62453: exiting _queue_task() for managed_node2/debug 25052 1726882468.62461: done queuing things up, now waiting for results queue to drain 25052 1726882468.62462: waiting for pending results... 25052 1726882468.62739: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25052 1726882468.63033: in run() - task 12673a56-9f93-f7f6-4a6d-000000000180 25052 1726882468.63055: variable 'ansible_search_path' from source: unknown 25052 1726882468.63063: variable 'ansible_search_path' from source: unknown 25052 1726882468.63144: calling self._execute() 25052 1726882468.63285: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.63354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.63425: variable 'omit' from source: magic vars 25052 1726882468.63772: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.63790: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.63808: variable 'omit' from source: magic vars 25052 1726882468.63857: variable 'omit' from source: magic vars 25052 1726882468.63959: variable 'current_interfaces' from source: set_fact 25052 1726882468.63987: variable 'omit' from source: magic vars 25052 1726882468.64031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882468.64067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882468.64097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882468.64120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.64136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.64168: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882468.64176: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.64183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.64280: Set connection var ansible_pipelining to False 25052 1726882468.64288: Set connection var ansible_connection to ssh 25052 1726882468.64299: Set connection var ansible_shell_type to sh 25052 1726882468.64312: Set connection var ansible_timeout to 10 25052 1726882468.64323: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882468.64331: Set connection var ansible_shell_executable to /bin/sh 25052 1726882468.64354: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.64361: variable 'ansible_connection' from source: unknown 25052 1726882468.64367: variable 'ansible_module_compression' from source: unknown 25052 1726882468.64373: variable 'ansible_shell_type' from source: unknown 25052 1726882468.64379: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.64385: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.64395: variable 'ansible_pipelining' from source: unknown 25052 1726882468.64402: variable 'ansible_timeout' from source: unknown 25052 1726882468.64409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.64543: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882468.64801: variable 'omit' from source: magic vars 25052 1726882468.64805: starting attempt loop 25052 1726882468.64807: running the handler 25052 1726882468.64809: handler run complete 25052 1726882468.64811: attempt loop complete, returning result 25052 1726882468.64813: _execute() done 25052 1726882468.64816: dumping result to json 25052 1726882468.64819: done dumping result, returning 25052 1726882468.64821: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-f7f6-4a6d-000000000180] 25052 1726882468.64823: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000180 25052 1726882468.64887: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000180 25052 1726882468.64891: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 25052 1726882468.64936: no more pending results, returning what we have 25052 1726882468.64939: results queue empty 25052 1726882468.64939: checking for any_errors_fatal 25052 1726882468.64943: done checking for any_errors_fatal 25052 1726882468.64944: checking for max_fail_percentage 25052 1726882468.64945: done checking for max_fail_percentage 25052 1726882468.64946: checking to see if all hosts have failed and the running result is not ok 25052 1726882468.64947: done checking to see if all hosts have failed 25052 1726882468.64948: getting the remaining hosts for this loop 25052 1726882468.64949: done getting the remaining hosts for this loop 25052 1726882468.64952: getting the next task for host managed_node2 25052 1726882468.64959: done getting next task for host managed_node2 25052 1726882468.64961: ^ task is: TASK: Install iproute 25052 1726882468.64964: ^ state is: HOST STATE: block=2, task=4, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882468.64967: getting variables 25052 1726882468.64969: in VariableManager get_vars() 25052 1726882468.65006: Calling all_inventory to load vars for managed_node2 25052 1726882468.65009: Calling groups_inventory to load vars for managed_node2 25052 1726882468.65011: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882468.65019: Calling all_plugins_play to load vars for managed_node2 25052 1726882468.65021: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882468.65024: Calling groups_plugins_play to load vars for managed_node2 25052 1726882468.65242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882468.65449: done with get_vars() 25052 1726882468.65459: done getting variables 25052 1726882468.65513: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:34:28 -0400 (0:00:00.038) 0:00:05.610 ****** 25052 1726882468.65545: entering _queue_task() for managed_node2/package 25052 1726882468.66015: worker is 1 (out of 1 available) 25052 1726882468.66023: exiting _queue_task() for managed_node2/package 25052 1726882468.66033: done queuing things up, now waiting for results queue to drain 25052 1726882468.66034: waiting for pending results... 
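The debug output shown above (MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']) is produced by a task in show_interfaces.yml. That task file is not reproduced in this log, but from the task name, the rendered message, and the current_interfaces set_fact variable it reads, it is roughly equivalent to the sketch below (the exact msg wording is an assumption):

    # Hypothetical reconstruction of the task at
    # tests/network/playbooks/tasks/show_interfaces.yml:5.
    # Only the task name and the rendered message come from the log above.
    - name: Show current_interfaces
      ansible.builtin.debug:
        msg: "current_interfaces: {{ current_interfaces }}"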
25052 1726882468.66055: running TaskExecutor() for managed_node2/TASK: Install iproute 25052 1726882468.66151: in run() - task 12673a56-9f93-f7f6-4a6d-000000000159 25052 1726882468.66170: variable 'ansible_search_path' from source: unknown 25052 1726882468.66180: variable 'ansible_search_path' from source: unknown 25052 1726882468.66217: calling self._execute() 25052 1726882468.66308: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.66318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.66332: variable 'omit' from source: magic vars 25052 1726882468.66690: variable 'ansible_distribution_major_version' from source: facts 25052 1726882468.66714: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882468.66726: variable 'omit' from source: magic vars 25052 1726882468.66764: variable 'omit' from source: magic vars 25052 1726882468.66963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882468.69136: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882468.69230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882468.69271: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882468.69317: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882468.69353: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882468.69455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882468.69488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882468.69525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882468.69575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882468.69597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882468.69705: variable '__network_is_ostree' from source: set_fact 25052 1726882468.69717: variable 'omit' from source: magic vars 25052 1726882468.69758: variable 'omit' from source: magic vars 25052 1726882468.69852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882468.69856: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882468.69858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882468.69865: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25052 1726882468.69885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882468.69922: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882468.69932: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.69940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.70054: Set connection var ansible_pipelining to False 25052 1726882468.70066: Set connection var ansible_connection to ssh 25052 1726882468.70075: Set connection var ansible_shell_type to sh 25052 1726882468.70096: Set connection var ansible_timeout to 10 25052 1726882468.70179: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882468.70182: Set connection var ansible_shell_executable to /bin/sh 25052 1726882468.70184: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.70187: variable 'ansible_connection' from source: unknown 25052 1726882468.70189: variable 'ansible_module_compression' from source: unknown 25052 1726882468.70191: variable 'ansible_shell_type' from source: unknown 25052 1726882468.70195: variable 'ansible_shell_executable' from source: unknown 25052 1726882468.70197: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882468.70199: variable 'ansible_pipelining' from source: unknown 25052 1726882468.70200: variable 'ansible_timeout' from source: unknown 25052 1726882468.70202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882468.70292: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882468.70313: variable 'omit' from source: magic vars 25052 1726882468.70324: starting attempt loop 25052 1726882468.70330: running the handler 25052 1726882468.70339: variable 'ansible_facts' from source: unknown 25052 1726882468.70397: variable 'ansible_facts' from source: unknown 25052 1726882468.70400: _low_level_execute_command(): starting 25052 1726882468.70402: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882468.71091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882468.71162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.71200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.71220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.71244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.71340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.73237: stdout chunk (state=3): >>>/root <<< 25052 1726882468.73241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.73243: stdout chunk (state=3): >>><<< 25052 1726882468.73245: stderr chunk (state=3): >>><<< 25052 1726882468.73248: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.73259: _low_level_execute_command(): starting 25052 1726882468.73261: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324 `" && echo ansible-tmp-1726882468.7321393-25339-36545638463324="` echo /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324 `" ) && sleep 0' 25052 1726882468.73900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882468.73915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882468.73929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.73946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882468.73960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882468.74007: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 
1726882468.74090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.74128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.74191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.76057: stdout chunk (state=3): >>>ansible-tmp-1726882468.7321393-25339-36545638463324=/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324 <<< 25052 1726882468.76301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.76304: stdout chunk (state=3): >>><<< 25052 1726882468.76306: stderr chunk (state=3): >>><<< 25052 1726882468.76308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882468.7321393-25339-36545638463324=/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.76311: variable 'ansible_module_compression' from source: unknown 25052 1726882468.76356: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 25052 1726882468.76360: ANSIBALLZ: Acquiring lock 25052 1726882468.76363: ANSIBALLZ: Lock acquired: 140207139645744 25052 1726882468.76365: ANSIBALLZ: Creating module 25052 1726882468.93815: ANSIBALLZ: Writing module into payload 25052 1726882468.93949: ANSIBALLZ: Writing module 25052 1726882468.93966: ANSIBALLZ: Renaming module 25052 1726882468.93977: ANSIBALLZ: Done creating module 25052 1726882468.93996: variable 'ansible_facts' from source: unknown 25052 1726882468.94056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py 25052 1726882468.94158: Sending initial data 25052 1726882468.94161: Sent initial data (151 bytes) 25052 1726882468.94585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.94620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882468.94623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882468.94625: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.94628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882468.94630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.94632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.94682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882468.94685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882468.94690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.94750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.96323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25052 1726882468.96330: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882468.96378: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882468.96441: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp3vqiyrev /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py <<< 25052 1726882468.96444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py" <<< 25052 1726882468.96498: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp3vqiyrev" to remote "/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py" <<< 25052 1726882468.97242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.97277: stderr chunk (state=3): >>><<< 25052 1726882468.97280: stdout chunk (state=3): >>><<< 25052 1726882468.97324: done transferring module to remote 25052 1726882468.97335: _low_level_execute_command(): starting 25052 1726882468.97340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/ /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py && sleep 0' 25052 1726882468.97787: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882468.97872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.97876: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882468.97921: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882468.97972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882468.99689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882468.99716: stderr chunk (state=3): >>><<< 25052 1726882468.99719: stdout chunk (state=3): >>><<< 25052 1726882468.99732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882468.99734: _low_level_execute_command(): starting 25052 1726882468.99739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/AnsiballZ_dnf.py && sleep 0' 25052 1726882469.00150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.00153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.00156: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882469.00158: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.00160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.00213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.00219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.00289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.40776: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25052 1726882469.45003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882469.45007: stdout chunk (state=3): >>><<< 25052 1726882469.45010: stderr chunk (state=3): >>><<< 25052 1726882469.45111: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
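The module arguments echoed in the dnf result above ("name": ["iproute"], "state": "present") suggest a package task along the following lines. This is a hedged reconstruction, not the actual contents of manage_test_interface.yml: the register/until details are inferred from the "attempts": 1 field and the "Evaluated conditional (__install_status is success)" line recorded a few entries further down, and the log shows the generic package action resolving to ansible.legacy.dnf on this host.

    # Hypothetical sketch of the "Install iproute" task
    # (tests/network/playbooks/tasks/manage_test_interface.yml:16).
    # name/state come from the module arguments above; register/until are
    # assumptions based on the __install_status evaluation and "attempts": 1.
    - name: Install iproute
      ansible.builtin.package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success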
25052 1726882469.45119: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882469.45122: _low_level_execute_command(): starting 25052 1726882469.45124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882468.7321393-25339-36545638463324/ > /dev/null 2>&1 && sleep 0' 25052 1726882469.45723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882469.45738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.45768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882469.45779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.45808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.45820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882469.45878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.45914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.45929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882469.45949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.46042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.47998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.48002: stderr chunk (state=3): >>><<< 25052 1726882469.48004: stdout chunk (state=3): >>><<< 25052 1726882469.48006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882469.48009: handler run complete 25052 1726882469.48368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882469.48560: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882469.48609: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882469.48639: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882469.48667: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882469.48743: variable '__install_status' from source: unknown 25052 1726882469.48762: Evaluated conditional (__install_status is success): True 25052 1726882469.48779: attempt loop complete, returning result 25052 1726882469.48851: _execute() done 25052 1726882469.48917: dumping result to json 25052 1726882469.48925: done dumping result, returning 25052 1726882469.48933: done running TaskExecutor() for managed_node2/TASK: Install iproute [12673a56-9f93-f7f6-4a6d-000000000159] 25052 1726882469.48938: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000159 25052 1726882469.49043: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000159 25052 1726882469.49046: WORKER PROCESS EXITING ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25052 1726882469.49364: no more pending results, returning what we have 25052 1726882469.49368: results queue empty 25052 1726882469.49369: checking for any_errors_fatal 25052 1726882469.49373: done checking for any_errors_fatal 25052 1726882469.49374: checking for max_fail_percentage 25052 1726882469.49376: done checking for max_fail_percentage 25052 1726882469.49376: checking to see if all hosts have failed and the running result is not ok 25052 1726882469.49377: done checking to see if all hosts have failed 25052 1726882469.49378: getting the remaining hosts for this loop 25052 1726882469.49379: done getting the remaining hosts for this loop 25052 1726882469.49383: getting the next task for host managed_node2 25052 1726882469.49389: done getting next task for host managed_node2 25052 1726882469.49391: ^ task is: TASK: Create veth interface {{ interface }} 25052 1726882469.49396: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882469.49400: getting variables 25052 1726882469.49401: in VariableManager get_vars() 25052 1726882469.49440: Calling all_inventory to load vars for managed_node2 25052 1726882469.49443: Calling groups_inventory to load vars for managed_node2 25052 1726882469.49445: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882469.49455: Calling all_plugins_play to load vars for managed_node2 25052 1726882469.49459: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882469.49462: Calling groups_plugins_play to load vars for managed_node2 25052 1726882469.49759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882469.50041: done with get_vars() 25052 1726882469.50052: done getting variables 25052 1726882469.50114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882469.50238: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:34:29 -0400 (0:00:00.847) 0:00:06.457 ****** 25052 1726882469.50285: entering _queue_task() for managed_node2/command 25052 1726882469.50562: worker is 1 (out of 1 available) 25052 1726882469.50575: exiting _queue_task() for managed_node2/command 25052 1726882469.50586: done queuing things up, now waiting for results queue to drain 25052 1726882469.50587: waiting for pending results... 
25052 1726882469.51210: running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 25052 1726882469.51216: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015a 25052 1726882469.51219: variable 'ansible_search_path' from source: unknown 25052 1726882469.51222: variable 'ansible_search_path' from source: unknown 25052 1726882469.51777: variable 'interface' from source: play vars 25052 1726882469.51969: variable 'interface' from source: play vars 25052 1726882469.52036: variable 'interface' from source: play vars 25052 1726882469.52390: Loaded config def from plugin (lookup/items) 25052 1726882469.52599: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25052 1726882469.52604: variable 'omit' from source: magic vars 25052 1726882469.52899: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.52903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.52906: variable 'omit' from source: magic vars 25052 1726882469.53527: variable 'ansible_distribution_major_version' from source: facts 25052 1726882469.53608: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882469.54028: variable 'type' from source: play vars 25052 1726882469.54032: variable 'state' from source: include params 25052 1726882469.54039: variable 'interface' from source: play vars 25052 1726882469.54047: variable 'current_interfaces' from source: set_fact 25052 1726882469.54058: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25052 1726882469.54069: variable 'omit' from source: magic vars 25052 1726882469.54155: variable 'omit' from source: magic vars 25052 1726882469.54300: variable 'item' from source: unknown 25052 1726882469.54388: variable 'item' from source: unknown 25052 1726882469.54481: variable 'omit' from source: magic vars 25052 1726882469.54521: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882469.54553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882469.54789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882469.54792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882469.54796: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882469.54798: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882469.54800: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.54802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.54967: Set connection var ansible_pipelining to False 25052 1726882469.54974: Set connection var ansible_connection to ssh 25052 1726882469.54980: Set connection var ansible_shell_type to sh 25052 1726882469.54990: Set connection var ansible_timeout to 10 25052 1726882469.55015: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882469.55030: Set connection var ansible_shell_executable to /bin/sh 25052 1726882469.55198: variable 'ansible_shell_executable' from source: unknown 25052 1726882469.55201: variable 'ansible_connection' from source: unknown 25052 1726882469.55203: 
variable 'ansible_module_compression' from source: unknown 25052 1726882469.55206: variable 'ansible_shell_type' from source: unknown 25052 1726882469.55208: variable 'ansible_shell_executable' from source: unknown 25052 1726882469.55210: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.55212: variable 'ansible_pipelining' from source: unknown 25052 1726882469.55214: variable 'ansible_timeout' from source: unknown 25052 1726882469.55219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.55460: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882469.55467: variable 'omit' from source: magic vars 25052 1726882469.55477: starting attempt loop 25052 1726882469.55683: running the handler 25052 1726882469.55686: _low_level_execute_command(): starting 25052 1726882469.55688: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882469.56956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.56960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882469.56962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.56965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.56967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.57208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.57258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.58985: stdout chunk (state=3): >>>/root <<< 25052 1726882469.58998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.59029: stderr chunk (state=3): >>><<< 25052 1726882469.59060: stdout chunk (state=3): >>><<< 25052 1726882469.59273: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882469.59277: _low_level_execute_command(): starting 25052 1726882469.59288: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276 `" && echo ansible-tmp-1726882469.591812-25369-230548781941276="` echo /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276 `" ) && sleep 0' 25052 1726882469.60476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.60662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.60728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.60898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.62699: stdout chunk (state=3): >>>ansible-tmp-1726882469.591812-25369-230548781941276=/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276 <<< 25052 1726882469.62854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.62902: stdout chunk (state=3): >>><<< 25052 1726882469.62978: stderr chunk (state=3): >>><<< 25052 1726882469.63316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882469.591812-25369-230548781941276=/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882469.63319: variable 'ansible_module_compression' from source: unknown 25052 1726882469.63322: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882469.63324: variable 'ansible_facts' from source: unknown 25052 1726882469.63530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py 25052 1726882469.63857: Sending initial data 25052 1726882469.64064: Sent initial data (155 bytes) 25052 1726882469.65200: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.65203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882469.65206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.65208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.65210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.65298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.65601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.67149: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 
1726882469.67521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpq7ybhsem" to remote "/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py" <<< 25052 1726882469.67525: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpq7ybhsem /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py <<< 25052 1726882469.68910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.68965: stderr chunk (state=3): >>><<< 25052 1726882469.69008: stdout chunk (state=3): >>><<< 25052 1726882469.69195: done transferring module to remote 25052 1726882469.69214: _low_level_execute_command(): starting 25052 1726882469.69229: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/ /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py && sleep 0' 25052 1726882469.70751: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.70754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.70763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882469.70765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882469.70767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.71120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.71127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.73099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.73104: stdout chunk (state=3): >>><<< 25052 1726882469.73106: stderr chunk (state=3): >>><<< 25052 1726882469.73109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882469.73111: _low_level_execute_command(): starting 25052 1726882469.73113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/AnsiballZ_command.py && sleep 0' 25052 1726882469.74077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882469.74097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.74101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.74115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882469.74126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882469.74132: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882469.74163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.74166: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882469.74169: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882469.74171: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882469.74173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.74176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.74188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882469.74305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882469.74312: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882469.74314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.74316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.74322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882469.74324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.74401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.90049: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:34:29.893815", "end": "2024-09-20 
21:34:29.898596", "delta": "0:00:00.004781", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882469.92259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882469.92280: stderr chunk (state=3): >>><<< 25052 1726882469.92283: stdout chunk (state=3): >>><<< 25052 1726882469.92300: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0"], "start": "2024-09-20 21:34:29.893815", "end": "2024-09-20 21:34:29.898596", "delta": "0:00:00.004781", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add veth0 type veth peer name peerveth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882469.92336: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add veth0 type veth peer name peerveth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882469.92345: _low_level_execute_command(): starting 25052 1726882469.92348: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882469.591812-25369-230548781941276/ > /dev/null 2>&1 && sleep 0' 25052 1726882469.92766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.92770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.92772: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.92774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882469.92777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.92827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882469.92831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.92955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882469.97683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882469.97712: stderr chunk (state=3): >>><<< 25052 1726882469.97717: stdout chunk (state=3): >>><<< 25052 1726882469.97732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882469.97737: handler run complete 25052 1726882469.97754: Evaluated conditional (False): False 25052 1726882469.97762: attempt loop complete, returning result 25052 1726882469.97777: variable 'item' from source: unknown 25052 1726882469.97844: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "veth0", "type", "veth", "peer", "name", "peerveth0" ], "delta": "0:00:00.004781", "end": "2024-09-20 21:34:29.898596", "item": "ip link add veth0 type veth peer name peerveth0", "rc": 0, "start": "2024-09-20 21:34:29.893815" } 25052 1726882469.98008: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.98012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.98014: variable 'omit' from source: magic vars 25052 1726882469.98085: variable 'ansible_distribution_major_version' from source: facts 25052 1726882469.98088: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882469.98213: variable 'type' from source: play vars 25052 1726882469.98216: variable 'state' from source: include params 25052 1726882469.98219: variable 'interface' from source: play vars 25052 1726882469.98223: variable 'current_interfaces' from source: set_fact 25052 1726882469.98229: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25052 1726882469.98238: variable 'omit' from source: magic vars 25052 1726882469.98248: variable 'omit' from source: magic vars 25052 1726882469.98273: variable 'item' from source: unknown 25052 1726882469.98320: variable 'item' from source: unknown 25052 1726882469.98331: variable 'omit' from source: magic vars 25052 1726882469.98350: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882469.98358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882469.98364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882469.98374: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882469.98377: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.98379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.98432: Set connection var ansible_pipelining to False 25052 1726882469.98436: Set connection var ansible_connection to ssh 25052 1726882469.98438: Set connection var ansible_shell_type to sh 25052 1726882469.98443: Set connection var ansible_timeout to 10 25052 1726882469.98454: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 
1726882469.98457: Set connection var ansible_shell_executable to /bin/sh 25052 1726882469.98470: variable 'ansible_shell_executable' from source: unknown 25052 1726882469.98473: variable 'ansible_connection' from source: unknown 25052 1726882469.98475: variable 'ansible_module_compression' from source: unknown 25052 1726882469.98477: variable 'ansible_shell_type' from source: unknown 25052 1726882469.98480: variable 'ansible_shell_executable' from source: unknown 25052 1726882469.98482: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882469.98486: variable 'ansible_pipelining' from source: unknown 25052 1726882469.98488: variable 'ansible_timeout' from source: unknown 25052 1726882469.98492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882469.98559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882469.98569: variable 'omit' from source: magic vars 25052 1726882469.98572: starting attempt loop 25052 1726882469.98575: running the handler 25052 1726882469.98581: _low_level_execute_command(): starting 25052 1726882469.98583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882469.99027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882469.99030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.99037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882469.99039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882469.99087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882469.99090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882469.99097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882469.99158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.00741: stdout chunk (state=3): >>>/root <<< 25052 1726882470.00840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.00867: stderr chunk (state=3): >>><<< 25052 1726882470.00871: stdout chunk (state=3): >>><<< 25052 1726882470.00885: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.00896: _low_level_execute_command(): starting 25052 1726882470.00900: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204 `" && echo ansible-tmp-1726882470.0088422-25369-133473720771204="` echo /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204 `" ) && sleep 0' 25052 1726882470.01338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.01341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882470.01344: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.01346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.01348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882470.01350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.01390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.01396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.01466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.03331: stdout chunk (state=3): >>>ansible-tmp-1726882470.0088422-25369-133473720771204=/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204 <<< 25052 1726882470.03441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.03462: stderr chunk (state=3): >>><<< 25052 1726882470.03465: stdout chunk (state=3): >>><<< 25052 1726882470.03480: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882470.0088422-25369-133473720771204=/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.03503: variable 'ansible_module_compression' from source: unknown 25052 1726882470.03531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882470.03546: variable 'ansible_facts' from source: unknown 25052 1726882470.03595: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py 25052 1726882470.03682: Sending initial data 25052 1726882470.03685: Sent initial data (156 bytes) 25052 1726882470.04121: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.04124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882470.04127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.04129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.04131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.04181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.04184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.04254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.05788: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25052 1726882470.05797: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882470.05848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882470.05909: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpcvglxssh /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py <<< 25052 1726882470.05915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py" <<< 25052 1726882470.05975: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpcvglxssh" to remote "/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py" <<< 25052 1726882470.06578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.06617: stderr chunk (state=3): >>><<< 25052 1726882470.06622: stdout chunk (state=3): >>><<< 25052 1726882470.06656: done transferring module to remote 25052 1726882470.06663: _low_level_execute_command(): starting 25052 1726882470.06668: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/ /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py && sleep 0' 25052 1726882470.07067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.07105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.07108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882470.07110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.07112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.07114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882470.07116: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.07159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.07163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.07230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.08946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.08969: stderr chunk (state=3): >>><<< 25052 1726882470.08972: stdout chunk (state=3): >>><<< 25052 1726882470.08986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.08989: _low_level_execute_command(): starting 25052 1726882470.08997: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/AnsiballZ_command.py && sleep 0' 25052 1726882470.09425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.09428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.09430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.09432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.09483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.09489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 
1726882470.09556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.24996: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:34:30.245303", "end": "2024-09-20 21:34:30.248858", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882470.26600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882470.26605: stdout chunk (state=3): >>><<< 25052 1726882470.26607: stderr chunk (state=3): >>><<< 25052 1726882470.26610: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerveth0", "up"], "start": "2024-09-20 21:34:30.245303", "end": "2024-09-20 21:34:30.248858", "delta": "0:00:00.003555", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerveth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
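Each loop item repeats the full remote cycle recorded above: 'echo ~' to resolve the home directory, creation of a per-task temp directory, SFTP transfer of AnsiballZ_command.py, chmod, execution with /usr/bin/python3.12, and removal of the temp directory. That round trip happens because the run has ansible_pipelining set to False (see the "Set connection var" entries before each item). As a purely hypothetical tweak, not part of this run, the transfer steps could be skipped by enabling pipelining for the managed node, for example in group_vars:

  # Hypothetical group_vars entry (not used in this run): with pipelining
  # enabled, module code is fed to the remote python over the existing SSH
  # session instead of being uploaded, chmod'ed and deleted for every item.
  ansible_pipelining: true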
25052 1726882470.26612: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerveth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882470.26614: _low_level_execute_command(): starting 25052 1726882470.26616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882470.0088422-25369-133473720771204/ > /dev/null 2>&1 && sleep 0' 25052 1726882470.27269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.27280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.27306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882470.27339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.27345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.27353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.27410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.27446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.27472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.27574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.29409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.29412: stdout chunk (state=3): >>><<< 25052 1726882470.29598: stderr chunk (state=3): >>><<< 25052 1726882470.29602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.29604: handler run complete 25052 1726882470.29606: Evaluated conditional (False): False 25052 1726882470.29608: attempt loop complete, returning result 25052 1726882470.29610: variable 'item' from source: unknown 25052 1726882470.29612: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerveth0", "up" ], "delta": "0:00:00.003555", "end": "2024-09-20 21:34:30.248858", "item": "ip link set peerveth0 up", "rc": 0, "start": "2024-09-20 21:34:30.245303" } 25052 1726882470.29711: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.29715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.29717: variable 'omit' from source: magic vars 25052 1726882470.29819: variable 'ansible_distribution_major_version' from source: facts 25052 1726882470.29822: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882470.29986: variable 'type' from source: play vars 25052 1726882470.29990: variable 'state' from source: include params 25052 1726882470.29997: variable 'interface' from source: play vars 25052 1726882470.30003: variable 'current_interfaces' from source: set_fact 25052 1726882470.30009: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 25052 1726882470.30013: variable 'omit' from source: magic vars 25052 1726882470.30035: variable 'omit' from source: magic vars 25052 1726882470.30071: variable 'item' from source: unknown 25052 1726882470.30139: variable 'item' from source: unknown 25052 1726882470.30298: variable 'omit' from source: magic vars 25052 1726882470.30301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882470.30308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882470.30311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882470.30313: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882470.30315: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.30317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.30319: Set connection var ansible_pipelining to False 25052 1726882470.30321: Set connection var ansible_connection to ssh 25052 1726882470.30323: Set connection var ansible_shell_type to sh 25052 1726882470.30325: Set connection var ansible_timeout to 10 25052 1726882470.30328: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 
1726882470.30330: Set connection var ansible_shell_executable to /bin/sh 25052 1726882470.30332: variable 'ansible_shell_executable' from source: unknown 25052 1726882470.30334: variable 'ansible_connection' from source: unknown 25052 1726882470.30336: variable 'ansible_module_compression' from source: unknown 25052 1726882470.30338: variable 'ansible_shell_type' from source: unknown 25052 1726882470.30339: variable 'ansible_shell_executable' from source: unknown 25052 1726882470.30341: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.30343: variable 'ansible_pipelining' from source: unknown 25052 1726882470.30352: variable 'ansible_timeout' from source: unknown 25052 1726882470.30356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.30461: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882470.30470: variable 'omit' from source: magic vars 25052 1726882470.30475: starting attempt loop 25052 1726882470.30478: running the handler 25052 1726882470.30485: _low_level_execute_command(): starting 25052 1726882470.30487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882470.31075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.31113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882470.31121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882470.31198: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.31216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.31309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.32881: stdout chunk (state=3): >>>/root <<< 25052 1726882470.32975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.33039: stderr chunk (state=3): >>><<< 25052 1726882470.33042: stdout chunk (state=3): >>><<< 25052 1726882470.33139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.33143: _low_level_execute_command(): starting 25052 1726882470.33145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118 `" && echo ansible-tmp-1726882470.330575-25369-92492525850118="` echo /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118 `" ) && sleep 0' 25052 1726882470.33699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.33715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.33738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.33754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.33770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882470.33781: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882470.33812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882470.33824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882470.33842: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.33908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.33941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.33959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.33979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.34076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.36224: stdout chunk (state=3): >>>ansible-tmp-1726882470.330575-25369-92492525850118=/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118 <<< 25052 1726882470.36228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.36231: stdout chunk (state=3): >>><<< 25052 
1726882470.36233: stderr chunk (state=3): >>><<< 25052 1726882470.36240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882470.330575-25369-92492525850118=/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.36242: variable 'ansible_module_compression' from source: unknown 25052 1726882470.36635: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882470.36638: variable 'ansible_facts' from source: unknown 25052 1726882470.36701: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py 25052 1726882470.36797: Sending initial data 25052 1726882470.36907: Sent initial data (154 bytes) 25052 1726882470.37421: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.37435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.37449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.37465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.37480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882470.37575: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.37607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.37628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.37721: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 25052 1726882470.39252: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882470.39314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882470.39380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpfsrnr4yx /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py <<< 25052 1726882470.39501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py" <<< 25052 1726882470.39529: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpfsrnr4yx" to remote "/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py" <<< 25052 1726882470.41003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.41237: stderr chunk (state=3): >>><<< 25052 1726882470.41240: stdout chunk (state=3): >>><<< 25052 1726882470.41248: done transferring module to remote 25052 1726882470.41261: _low_level_execute_command(): starting 25052 1726882470.41405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/ /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py && sleep 0' 25052 1726882470.42334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.42348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.42362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.42385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.42409: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882470.42445: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882470.42457: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.42536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.42561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.42576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.42677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.44448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.44455: stdout chunk (state=3): >>><<< 25052 1726882470.44461: stderr chunk (state=3): >>><<< 25052 1726882470.44499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.44510: _low_level_execute_command(): starting 25052 1726882470.44513: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/AnsiballZ_command.py && sleep 0' 25052 1726882470.45383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.45386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.45389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.45391: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.45590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.45595: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.45618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.45719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.61271: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:34:30.607198", "end": "2024-09-20 21:34:30.610912", "delta": "0:00:00.003714", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882470.62785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882470.62790: stdout chunk (state=3): >>><<< 25052 1726882470.62792: stderr chunk (state=3): >>><<< 25052 1726882470.62818: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "veth0", "up"], "start": "2024-09-20 21:34:30.607198", "end": "2024-09-20 21:34:30.610912", "delta": "0:00:00.003714", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set veth0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
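The JSON payload above is the return value of the ansible.legacy.command module for the loop item 'ip link set veth0 up', produced by the "Create veth interface veth0" task whose per-item result is reported a few entries later. A task that yields a result of this shape could look roughly like the sketch below; this is a hedged reconstruction from the logged loop item and module arguments, not the verbatim contents of manage_test_interface.yml, and the first loop entry and the guard are assumptions added for illustration.

    # Sketch only: reconstructed from the logged loop item; not the actual tasks file.
    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      loop:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}  # assumed companion item
        - ip link set {{ interface }} up                                       # matches the logged item
      when: type == 'veth' and state == 'present'  # guard assumed to mirror the related tasks in this trace
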
25052 1726882470.62904: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set veth0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882470.62907: _low_level_execute_command(): starting 25052 1726882470.62910: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882470.330575-25369-92492525850118/ > /dev/null 2>&1 && sleep 0' 25052 1726882470.63512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.63568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.63645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.63674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.63699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.63802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.65655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.65683: stdout chunk (state=3): >>><<< 25052 1726882470.65686: stderr chunk (state=3): >>><<< 25052 1726882470.65708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.65798: handler run complete 25052 1726882470.65803: Evaluated conditional (False): False 25052 1726882470.65806: attempt loop complete, returning result 25052 1726882470.65808: variable 'item' from source: unknown 25052 1726882470.65870: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "veth0", "up" ], "delta": "0:00:00.003714", "end": "2024-09-20 21:34:30.610912", "item": "ip link set veth0 up", "rc": 0, "start": "2024-09-20 21:34:30.607198" } 25052 1726882470.66299: dumping result to json 25052 1726882470.66302: done dumping result, returning 25052 1726882470.66313: done running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 [12673a56-9f93-f7f6-4a6d-00000000015a] 25052 1726882470.66315: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015a 25052 1726882470.66364: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015a 25052 1726882470.66367: WORKER PROCESS EXITING 25052 1726882470.66443: no more pending results, returning what we have 25052 1726882470.66447: results queue empty 25052 1726882470.66448: checking for any_errors_fatal 25052 1726882470.66452: done checking for any_errors_fatal 25052 1726882470.66453: checking for max_fail_percentage 25052 1726882470.66454: done checking for max_fail_percentage 25052 1726882470.66455: checking to see if all hosts have failed and the running result is not ok 25052 1726882470.66456: done checking to see if all hosts have failed 25052 1726882470.66457: getting the remaining hosts for this loop 25052 1726882470.66458: done getting the remaining hosts for this loop 25052 1726882470.66461: getting the next task for host managed_node2 25052 1726882470.66468: done getting next task for host managed_node2 25052 1726882470.66470: ^ task is: TASK: Set up veth as managed by NetworkManager 25052 1726882470.66473: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882470.66477: getting variables 25052 1726882470.66479: in VariableManager get_vars() 25052 1726882470.66515: Calling all_inventory to load vars for managed_node2 25052 1726882470.66518: Calling groups_inventory to load vars for managed_node2 25052 1726882470.66638: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882470.66648: Calling all_plugins_play to load vars for managed_node2 25052 1726882470.66651: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882470.66655: Calling groups_plugins_play to load vars for managed_node2 25052 1726882470.66926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882470.67139: done with get_vars() 25052 1726882470.67149: done getting variables 25052 1726882470.67214: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:34:30 -0400 (0:00:01.169) 0:00:07.627 ****** 25052 1726882470.67247: entering _queue_task() for managed_node2/command 25052 1726882470.67540: worker is 1 (out of 1 available) 25052 1726882470.67551: exiting _queue_task() for managed_node2/command 25052 1726882470.67563: done queuing things up, now waiting for results queue to drain 25052 1726882470.67564: waiting for pending results... 
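The header above queues the next task, "Set up veth as managed by NetworkManager", from manage_test_interface.yml:35. The trace that follows shows its conditional (type == 'veth' and state == 'present') evaluating to True and the module running 'nmcli d set veth0 managed true', so the task presumably looks roughly like this minimal sketch (the interface name comes from play vars; this is not the verbatim file contents):

    # Sketch only: command and conditional taken from the trace that follows.
    - name: Set up veth as managed by NetworkManager
      ansible.builtin.command: nmcli d set {{ interface }} managed true
      when: type == 'veth' and state == 'present'
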
25052 1726882470.67851: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 25052 1726882470.67949: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015b 25052 1726882470.67952: variable 'ansible_search_path' from source: unknown 25052 1726882470.67956: variable 'ansible_search_path' from source: unknown 25052 1726882470.67962: calling self._execute() 25052 1726882470.68053: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.68067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.68081: variable 'omit' from source: magic vars 25052 1726882470.68414: variable 'ansible_distribution_major_version' from source: facts 25052 1726882470.68431: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882470.68603: variable 'type' from source: play vars 25052 1726882470.68606: variable 'state' from source: include params 25052 1726882470.68608: Evaluated conditional (type == 'veth' and state == 'present'): True 25052 1726882470.68709: variable 'omit' from source: magic vars 25052 1726882470.68713: variable 'omit' from source: magic vars 25052 1726882470.68755: variable 'interface' from source: play vars 25052 1726882470.68776: variable 'omit' from source: magic vars 25052 1726882470.68824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882470.68864: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882470.68888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882470.68911: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882470.68937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882470.68973: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882470.68981: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.68988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.69099: Set connection var ansible_pipelining to False 25052 1726882470.69143: Set connection var ansible_connection to ssh 25052 1726882470.69146: Set connection var ansible_shell_type to sh 25052 1726882470.69148: Set connection var ansible_timeout to 10 25052 1726882470.69150: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882470.69152: Set connection var ansible_shell_executable to /bin/sh 25052 1726882470.69169: variable 'ansible_shell_executable' from source: unknown 25052 1726882470.69176: variable 'ansible_connection' from source: unknown 25052 1726882470.69182: variable 'ansible_module_compression' from source: unknown 25052 1726882470.69188: variable 'ansible_shell_type' from source: unknown 25052 1726882470.69196: variable 'ansible_shell_executable' from source: unknown 25052 1726882470.69253: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882470.69256: variable 'ansible_pipelining' from source: unknown 25052 1726882470.69258: variable 'ansible_timeout' from source: unknown 25052 1726882470.69260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882470.69416: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882470.69430: variable 'omit' from source: magic vars 25052 1726882470.69439: starting attempt loop 25052 1726882470.69444: running the handler 25052 1726882470.69469: _low_level_execute_command(): starting 25052 1726882470.69486: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882470.70287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.70328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.70348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.70372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.70466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.72180: stdout chunk (state=3): >>>/root <<< 25052 1726882470.72282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.72299: stdout chunk (state=3): >>><<< 25052 1726882470.72390: stderr chunk (state=3): >>><<< 25052 1726882470.72398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.72402: 
_low_level_execute_command(): starting 25052 1726882470.72406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788 `" && echo ansible-tmp-1726882470.7233338-25430-136571536771788="` echo /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788 `" ) && sleep 0' 25052 1726882470.73005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.73021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.73084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.73155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.73210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.73281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.75728: stdout chunk (state=3): >>>ansible-tmp-1726882470.7233338-25430-136571536771788=/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788 <<< 25052 1726882470.75732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.75734: stdout chunk (state=3): >>><<< 25052 1726882470.75736: stderr chunk (state=3): >>><<< 25052 1726882470.75739: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882470.7233338-25430-136571536771788=/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.75741: variable 'ansible_module_compression' from source: unknown 25052 1726882470.76101: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882470.76105: variable 'ansible_facts' from source: unknown 25052 1726882470.76443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py 25052 1726882470.76791: Sending initial data 25052 1726882470.76799: Sent initial data (156 bytes) 25052 1726882470.78251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.78317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.80001: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25052 1726882470.80067: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882470.80124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpy2zsd_up /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py <<< 25052 1726882470.80137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py" <<< 25052 1726882470.80276: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpy2zsd_up" to remote "/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py" <<< 25052 1726882470.81805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.81925: stderr chunk (state=3): >>><<< 25052 1726882470.81928: stdout chunk (state=3): >>><<< 25052 1726882470.81931: done transferring module to remote 25052 1726882470.81933: _low_level_execute_command(): starting 25052 1726882470.81935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/ /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py && sleep 0' 25052 1726882470.83128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.83140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.83235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.83283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882470.85079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882470.85397: stderr chunk (state=3): >>><<< 25052 1726882470.85401: stdout chunk (state=3): >>><<< 25052 1726882470.85403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882470.85411: _low_level_execute_command(): starting 25052 1726882470.85413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/AnsiballZ_command.py && sleep 0' 25052 1726882470.86604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882470.86625: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882470.86641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882470.86660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882470.86723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882470.86777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882470.86797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882470.86823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882470.87128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.03926: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:34:31.019880", "end": "2024-09-20 21:34:31.038161", "delta": "0:00:00.018281", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882471.05674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882471.05678: stdout chunk (state=3): >>><<< 25052 1726882471.05681: stderr chunk (state=3): >>><<< 25052 1726882471.06036: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "veth0", "managed", "true"], "start": "2024-09-20 21:34:31.019880", "end": "2024-09-20 21:34:31.038161", "delta": "0:00:00.018281", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set veth0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882471.06041: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set veth0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882471.06044: _low_level_execute_command(): starting 25052 1726882471.06046: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882470.7233338-25430-136571536771788/ > /dev/null 2>&1 && sleep 0' 25052 1726882471.06599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882471.06613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.06704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882471.06717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.06728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.06740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.06833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.08800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.08803: stdout chunk (state=3): >>><<< 25052 1726882471.08806: stderr chunk (state=3): >>><<< 25052 1726882471.08808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.08830: handler run complete 25052 1726882471.08858: Evaluated conditional (False): False 25052 1726882471.08914: attempt loop complete, returning result 25052 1726882471.09201: _execute() done 25052 1726882471.09206: dumping result to json 25052 1726882471.09209: done dumping result, returning 25052 1726882471.09211: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-f7f6-4a6d-00000000015b] 25052 1726882471.09213: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015b 25052 1726882471.09286: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015b 25052 1726882471.09289: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "veth0", "managed", "true" ], "delta": "0:00:00.018281", "end": "2024-09-20 21:34:31.038161", "rc": 0, "start": "2024-09-20 21:34:31.019880" } 25052 1726882471.09371: no more pending results, returning what we have 25052 1726882471.09375: results queue empty 25052 1726882471.09376: checking for any_errors_fatal 25052 1726882471.09391: done checking for any_errors_fatal 25052 1726882471.09396: checking for max_fail_percentage 25052 1726882471.09399: done checking for max_fail_percentage 25052 1726882471.09400: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.09400: done checking to see if all hosts have failed 25052 1726882471.09401: getting the remaining hosts for this loop 25052 1726882471.09409: done getting the remaining hosts for this loop 25052 1726882471.09413: getting the next task for host managed_node2 25052 1726882471.09421: done getting next task for host managed_node2 25052 1726882471.09424: ^ task is: TASK: Delete veth interface {{ interface }} 25052 1726882471.09427: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.09432: getting variables 25052 1726882471.09434: in VariableManager get_vars() 25052 1726882471.09477: Calling all_inventory to load vars for managed_node2 25052 1726882471.09481: Calling groups_inventory to load vars for managed_node2 25052 1726882471.09484: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.09681: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.09686: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.09695: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.10165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.10798: done with get_vars() 25052 1726882471.10811: done getting variables 25052 1726882471.10919: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882471.11151: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:34:31 -0400 (0:00:00.439) 0:00:08.066 ****** 25052 1726882471.11181: entering _queue_task() for managed_node2/command 25052 1726882471.11885: worker is 1 (out of 1 available) 25052 1726882471.11983: exiting _queue_task() for managed_node2/command 25052 1726882471.11999: done queuing things up, now waiting for results queue to drain 25052 1726882471.12000: waiting for pending results... 
25052 1726882471.12214: running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 25052 1726882471.12280: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015c 25052 1726882471.12308: variable 'ansible_search_path' from source: unknown 25052 1726882471.12401: variable 'ansible_search_path' from source: unknown 25052 1726882471.12405: calling self._execute() 25052 1726882471.12442: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.12451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.12465: variable 'omit' from source: magic vars 25052 1726882471.12820: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.12836: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.13047: variable 'type' from source: play vars 25052 1726882471.13064: variable 'state' from source: include params 25052 1726882471.13072: variable 'interface' from source: play vars 25052 1726882471.13172: variable 'current_interfaces' from source: set_fact 25052 1726882471.13175: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 25052 1726882471.13177: when evaluation is False, skipping this task 25052 1726882471.13179: _execute() done 25052 1726882471.13181: dumping result to json 25052 1726882471.13183: done dumping result, returning 25052 1726882471.13185: done running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 [12673a56-9f93-f7f6-4a6d-00000000015c] 25052 1726882471.13186: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015c 25052 1726882471.13247: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015c 25052 1726882471.13250: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882471.13328: no more pending results, returning what we have 25052 1726882471.13332: results queue empty 25052 1726882471.13333: checking for any_errors_fatal 25052 1726882471.13343: done checking for any_errors_fatal 25052 1726882471.13344: checking for max_fail_percentage 25052 1726882471.13345: done checking for max_fail_percentage 25052 1726882471.13346: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.13347: done checking to see if all hosts have failed 25052 1726882471.13347: getting the remaining hosts for this loop 25052 1726882471.13349: done getting the remaining hosts for this loop 25052 1726882471.13352: getting the next task for host managed_node2 25052 1726882471.13497: done getting next task for host managed_node2 25052 1726882471.13501: ^ task is: TASK: Create dummy interface {{ interface }} 25052 1726882471.13503: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.13507: getting variables 25052 1726882471.13509: in VariableManager get_vars() 25052 1726882471.13541: Calling all_inventory to load vars for managed_node2 25052 1726882471.13544: Calling groups_inventory to load vars for managed_node2 25052 1726882471.13546: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.13555: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.13557: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.13560: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.13727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.14146: done with get_vars() 25052 1726882471.14156: done getting variables 25052 1726882471.14433: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882471.14646: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:34:31 -0400 (0:00:00.034) 0:00:08.101 ****** 25052 1726882471.14676: entering _queue_task() for managed_node2/command 25052 1726882471.15238: worker is 1 (out of 1 available) 25052 1726882471.15251: exiting _queue_task() for managed_node2/command 25052 1726882471.15263: done queuing things up, now waiting for results queue to drain 25052 1726882471.15264: waiting for pending results... 
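The preceding entries show the "Delete veth interface veth0" task being skipped because its guard, type == 'veth' and state == 'absent' and interface in current_interfaces, is False for this run (state is 'present'). A task guarded that way might be sketched as follows; the when expression is copied from the logged false_condition, while the command body is only an assumption for illustration:

    # Sketch only: guard copied from the logged false_condition; command body assumed.
    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }}
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
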
25052 1726882471.15727: running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 25052 1726882471.15737: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015d 25052 1726882471.15740: variable 'ansible_search_path' from source: unknown 25052 1726882471.15743: variable 'ansible_search_path' from source: unknown 25052 1726882471.15746: calling self._execute() 25052 1726882471.15748: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.15750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.15753: variable 'omit' from source: magic vars 25052 1726882471.16123: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.16138: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.16347: variable 'type' from source: play vars 25052 1726882471.16357: variable 'state' from source: include params 25052 1726882471.16370: variable 'interface' from source: play vars 25052 1726882471.16378: variable 'current_interfaces' from source: set_fact 25052 1726882471.16432: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25052 1726882471.16435: when evaluation is False, skipping this task 25052 1726882471.16438: _execute() done 25052 1726882471.16440: dumping result to json 25052 1726882471.16442: done dumping result, returning 25052 1726882471.16444: done running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 [12673a56-9f93-f7f6-4a6d-00000000015d] 25052 1726882471.16446: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015d skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882471.16582: no more pending results, returning what we have 25052 1726882471.16586: results queue empty 25052 1726882471.16587: checking for any_errors_fatal 25052 1726882471.16595: done checking for any_errors_fatal 25052 1726882471.16596: checking for max_fail_percentage 25052 1726882471.16597: done checking for max_fail_percentage 25052 1726882471.16598: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.16599: done checking to see if all hosts have failed 25052 1726882471.16600: getting the remaining hosts for this loop 25052 1726882471.16601: done getting the remaining hosts for this loop 25052 1726882471.16605: getting the next task for host managed_node2 25052 1726882471.16611: done getting next task for host managed_node2 25052 1726882471.16613: ^ task is: TASK: Delete dummy interface {{ interface }} 25052 1726882471.16616: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.16620: getting variables 25052 1726882471.16622: in VariableManager get_vars() 25052 1726882471.16660: Calling all_inventory to load vars for managed_node2 25052 1726882471.16663: Calling groups_inventory to load vars for managed_node2 25052 1726882471.16666: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.16677: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.16680: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.16683: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.17075: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015d 25052 1726882471.17078: WORKER PROCESS EXITING 25052 1726882471.17100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.17302: done with get_vars() 25052 1726882471.17312: done getting variables 25052 1726882471.17375: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882471.17486: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:34:31 -0400 (0:00:00.028) 0:00:08.130 ****** 25052 1726882471.17517: entering _queue_task() for managed_node2/command 25052 1726882471.17742: worker is 1 (out of 1 available) 25052 1726882471.17756: exiting _queue_task() for managed_node2/command 25052 1726882471.17769: done queuing things up, now waiting for results queue to drain 25052 1726882471.17770: waiting for pending results... 
25052 1726882471.18030: running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 25052 1726882471.18133: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015e 25052 1726882471.18152: variable 'ansible_search_path' from source: unknown 25052 1726882471.18160: variable 'ansible_search_path' from source: unknown 25052 1726882471.18201: calling self._execute() 25052 1726882471.18291: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.18304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.18318: variable 'omit' from source: magic vars 25052 1726882471.18675: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.18691: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.18890: variable 'type' from source: play vars 25052 1726882471.18902: variable 'state' from source: include params 25052 1726882471.18909: variable 'interface' from source: play vars 25052 1726882471.18915: variable 'current_interfaces' from source: set_fact 25052 1726882471.18925: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25052 1726882471.18930: when evaluation is False, skipping this task 25052 1726882471.18935: _execute() done 25052 1726882471.18942: dumping result to json 25052 1726882471.18947: done dumping result, returning 25052 1726882471.18955: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 [12673a56-9f93-f7f6-4a6d-00000000015e] 25052 1726882471.18962: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015e skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882471.19136: no more pending results, returning what we have 25052 1726882471.19140: results queue empty 25052 1726882471.19141: checking for any_errors_fatal 25052 1726882471.19146: done checking for any_errors_fatal 25052 1726882471.19147: checking for max_fail_percentage 25052 1726882471.19148: done checking for max_fail_percentage 25052 1726882471.19149: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.19150: done checking to see if all hosts have failed 25052 1726882471.19151: getting the remaining hosts for this loop 25052 1726882471.19152: done getting the remaining hosts for this loop 25052 1726882471.19155: getting the next task for host managed_node2 25052 1726882471.19162: done getting next task for host managed_node2 25052 1726882471.19164: ^ task is: TASK: Create tap interface {{ interface }} 25052 1726882471.19168: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.19172: getting variables 25052 1726882471.19174: in VariableManager get_vars() 25052 1726882471.19324: Calling all_inventory to load vars for managed_node2 25052 1726882471.19327: Calling groups_inventory to load vars for managed_node2 25052 1726882471.19329: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.19339: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.19341: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.19344: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.19553: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015e 25052 1726882471.19556: WORKER PROCESS EXITING 25052 1726882471.19577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.19782: done with get_vars() 25052 1726882471.19792: done getting variables 25052 1726882471.19850: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882471.19950: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:34:31 -0400 (0:00:00.024) 0:00:08.154 ****** 25052 1726882471.19982: entering _queue_task() for managed_node2/command 25052 1726882471.20416: worker is 1 (out of 1 available) 25052 1726882471.20424: exiting _queue_task() for managed_node2/command 25052 1726882471.20433: done queuing things up, now waiting for results queue to drain 25052 1726882471.20435: waiting for pending results... 
25052 1726882471.20465: running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 25052 1726882471.20566: in run() - task 12673a56-9f93-f7f6-4a6d-00000000015f 25052 1726882471.20583: variable 'ansible_search_path' from source: unknown 25052 1726882471.20590: variable 'ansible_search_path' from source: unknown 25052 1726882471.20626: calling self._execute() 25052 1726882471.20711: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.20721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.20733: variable 'omit' from source: magic vars 25052 1726882471.21056: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.21071: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.21277: variable 'type' from source: play vars 25052 1726882471.21287: variable 'state' from source: include params 25052 1726882471.21297: variable 'interface' from source: play vars 25052 1726882471.21311: variable 'current_interfaces' from source: set_fact 25052 1726882471.21323: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25052 1726882471.21329: when evaluation is False, skipping this task 25052 1726882471.21335: _execute() done 25052 1726882471.21341: dumping result to json 25052 1726882471.21348: done dumping result, returning 25052 1726882471.21357: done running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 [12673a56-9f93-f7f6-4a6d-00000000015f] 25052 1726882471.21364: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015f skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882471.21567: no more pending results, returning what we have 25052 1726882471.21571: results queue empty 25052 1726882471.21572: checking for any_errors_fatal 25052 1726882471.21577: done checking for any_errors_fatal 25052 1726882471.21578: checking for max_fail_percentage 25052 1726882471.21579: done checking for max_fail_percentage 25052 1726882471.21580: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.21581: done checking to see if all hosts have failed 25052 1726882471.21582: getting the remaining hosts for this loop 25052 1726882471.21583: done getting the remaining hosts for this loop 25052 1726882471.21586: getting the next task for host managed_node2 25052 1726882471.21595: done getting next task for host managed_node2 25052 1726882471.21598: ^ task is: TASK: Delete tap interface {{ interface }} 25052 1726882471.21601: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.21605: getting variables 25052 1726882471.21607: in VariableManager get_vars() 25052 1726882471.21650: Calling all_inventory to load vars for managed_node2 25052 1726882471.21652: Calling groups_inventory to load vars for managed_node2 25052 1726882471.21655: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.21667: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.21670: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.21673: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.22001: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000015f 25052 1726882471.22004: WORKER PROCESS EXITING 25052 1726882471.22027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.22212: done with get_vars() 25052 1726882471.22220: done getting variables 25052 1726882471.22267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882471.22363: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:34:31 -0400 (0:00:00.024) 0:00:08.178 ****** 25052 1726882471.22392: entering _queue_task() for managed_node2/command 25052 1726882471.22735: worker is 1 (out of 1 available) 25052 1726882471.22746: exiting _queue_task() for managed_node2/command 25052 1726882471.22757: done queuing things up, now waiting for results queue to drain 25052 1726882471.22758: waiting for pending results... 
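The surrounding records skip "Create tap interface veth0" and "Delete tap interface veth0" in the same way. A sketch of that tap branch pair, again with the conditionals copied from the logged false_condition values and the iproute2 commands assumed:

# Sketch only: the commands are assumptions; only the conditionals appear in the log.
- name: Create tap interface {{ interface }}
  ansible.builtin.command: ip tuntap add dev {{ interface }} mode tap
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  ansible.builtin.command: ip tuntap del dev {{ interface }} mode tap
  when: type == 'tap' and state == 'absent' and interface in current_interfaces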
25052 1726882471.22912: running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 25052 1726882471.23026: in run() - task 12673a56-9f93-f7f6-4a6d-000000000160 25052 1726882471.23044: variable 'ansible_search_path' from source: unknown 25052 1726882471.23058: variable 'ansible_search_path' from source: unknown 25052 1726882471.23109: calling self._execute() 25052 1726882471.23202: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.23271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.23276: variable 'omit' from source: magic vars 25052 1726882471.23583: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.23606: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.23816: variable 'type' from source: play vars 25052 1726882471.23827: variable 'state' from source: include params 25052 1726882471.23836: variable 'interface' from source: play vars 25052 1726882471.23846: variable 'current_interfaces' from source: set_fact 25052 1726882471.23861: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25052 1726882471.23898: when evaluation is False, skipping this task 25052 1726882471.23901: _execute() done 25052 1726882471.23903: dumping result to json 25052 1726882471.23906: done dumping result, returning 25052 1726882471.23908: done running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 [12673a56-9f93-f7f6-4a6d-000000000160] 25052 1726882471.23910: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000160 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882471.24210: no more pending results, returning what we have 25052 1726882471.24213: results queue empty 25052 1726882471.24214: checking for any_errors_fatal 25052 1726882471.24218: done checking for any_errors_fatal 25052 1726882471.24219: checking for max_fail_percentage 25052 1726882471.24220: done checking for max_fail_percentage 25052 1726882471.24221: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.24221: done checking to see if all hosts have failed 25052 1726882471.24222: getting the remaining hosts for this loop 25052 1726882471.24223: done getting the remaining hosts for this loop 25052 1726882471.24225: getting the next task for host managed_node2 25052 1726882471.24231: done getting next task for host managed_node2 25052 1726882471.24233: ^ task is: TASK: Set up gateway ip on veth peer 25052 1726882471.24235: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.24238: getting variables 25052 1726882471.24240: in VariableManager get_vars() 25052 1726882471.24269: Calling all_inventory to load vars for managed_node2 25052 1726882471.24271: Calling groups_inventory to load vars for managed_node2 25052 1726882471.24273: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.24281: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.24283: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.24285: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.24479: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000160 25052 1726882471.24483: WORKER PROCESS EXITING 25052 1726882471.24510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.24707: done with get_vars() 25052 1726882471.24716: done getting variables 25052 1726882471.24805: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set up gateway ip on veth peer] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:15 Friday 20 September 2024 21:34:31 -0400 (0:00:00.024) 0:00:08.203 ****** 25052 1726882471.24836: entering _queue_task() for managed_node2/shell 25052 1726882471.24838: Creating lock for shell 25052 1726882471.25179: worker is 1 (out of 1 available) 25052 1726882471.25189: exiting _queue_task() for managed_node2/shell 25052 1726882471.25201: done queuing things up, now waiting for results queue to drain 25052 1726882471.25202: waiting for pending results... 
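Here the shell action for "Set up gateway ip on veth peer" (tests_ipv6.yml:15) is queued; the exact commands it runs appear verbatim in the module invocation further down. A reconstruction of the task based on those _raw_params (the device name may well be templated as peer{{ interface }} in the real file rather than written out literally):

- name: Set up gateway ip on veth peer
  ansible.builtin.shell: |
    ip netns add ns1
    ip link set peerveth0 netns ns1
    ip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0
    ip netns exec ns1 ip link set peerveth0 up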
25052 1726882471.25349: running TaskExecutor() for managed_node2/TASK: Set up gateway ip on veth peer 25052 1726882471.25439: in run() - task 12673a56-9f93-f7f6-4a6d-00000000000d 25052 1726882471.25457: variable 'ansible_search_path' from source: unknown 25052 1726882471.25501: calling self._execute() 25052 1726882471.25584: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.25602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.25617: variable 'omit' from source: magic vars 25052 1726882471.25959: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.25978: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.25988: variable 'omit' from source: magic vars 25052 1726882471.26021: variable 'omit' from source: magic vars 25052 1726882471.26164: variable 'interface' from source: play vars 25052 1726882471.26190: variable 'omit' from source: magic vars 25052 1726882471.26235: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882471.26276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882471.26308: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882471.26328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.26345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.26380: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882471.26388: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.26400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.26508: Set connection var ansible_pipelining to False 25052 1726882471.26516: Set connection var ansible_connection to ssh 25052 1726882471.26522: Set connection var ansible_shell_type to sh 25052 1726882471.26533: Set connection var ansible_timeout to 10 25052 1726882471.26543: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882471.26552: Set connection var ansible_shell_executable to /bin/sh 25052 1726882471.26579: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.26586: variable 'ansible_connection' from source: unknown 25052 1726882471.26592: variable 'ansible_module_compression' from source: unknown 25052 1726882471.26601: variable 'ansible_shell_type' from source: unknown 25052 1726882471.26607: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.26617: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.26624: variable 'ansible_pipelining' from source: unknown 25052 1726882471.26686: variable 'ansible_timeout' from source: unknown 25052 1726882471.26689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.26777: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882471.26798: variable 'omit' from source: magic vars 25052 
1726882471.26808: starting attempt loop 25052 1726882471.26814: running the handler 25052 1726882471.26829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882471.26852: _low_level_execute_command(): starting 25052 1726882471.26863: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882471.27588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882471.27603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.27676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.27729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.27755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.27779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.27897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.29513: stdout chunk (state=3): >>>/root <<< 25052 1726882471.29653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.29656: stdout chunk (state=3): >>><<< 25052 1726882471.29658: stderr chunk (state=3): >>><<< 25052 1726882471.29758: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 25052 1726882471.29761: _low_level_execute_command(): starting 25052 1726882471.29764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462 `" && echo ansible-tmp-1726882471.2968113-25465-6883277363462="` echo /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462 `" ) && sleep 0' 25052 1726882471.30276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882471.30291: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.30310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.30349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882471.30459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.30473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.30512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.30577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.32443: stdout chunk (state=3): >>>ansible-tmp-1726882471.2968113-25465-6883277363462=/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462 <<< 25052 1726882471.32595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.32599: stdout chunk (state=3): >>><<< 25052 1726882471.32601: stderr chunk (state=3): >>><<< 25052 1726882471.32619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882471.2968113-25465-6883277363462=/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.32799: variable 'ansible_module_compression' from source: unknown 25052 1726882471.32803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882471.32805: variable 'ansible_facts' from source: unknown 25052 1726882471.32838: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py 25052 1726882471.33043: Sending initial data 25052 1726882471.33143: Sent initial data (154 bytes) 25052 1726882471.33668: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882471.33686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.33704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.33722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882471.33737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882471.33799: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.33845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.33861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.33880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.33970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.35514: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882471.35571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882471.35640: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp12s97hix /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py <<< 25052 1726882471.35643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py" <<< 25052 1726882471.35705: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp12s97hix" to remote "/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py" <<< 25052 1726882471.36717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.36721: stderr chunk (state=3): >>><<< 25052 1726882471.36723: stdout chunk (state=3): >>><<< 25052 1726882471.36726: done transferring module to remote 25052 1726882471.36728: _low_level_execute_command(): starting 25052 1726882471.36730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/ /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py && sleep 0' 25052 1726882471.37310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.37332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882471.37351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882471.37354: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882471.37409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.37442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.37461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.37476: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.37569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.39346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.39349: stdout chunk (state=3): >>><<< 25052 1726882471.39352: stderr chunk (state=3): >>><<< 25052 1726882471.39448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.39452: _low_level_execute_command(): starting 25052 1726882471.39455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/AnsiballZ_command.py && sleep 0' 25052 1726882471.40028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882471.40045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.40061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.40079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882471.40100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882471.40149: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.40218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.40236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.40268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.40371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.57806: stdout chunk (state=3): >>> <<< 25052 1726882471.57813: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:34:31.552550", "end": "2024-09-20 21:34:31.576917", "delta": "0:00:00.024367", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882471.59510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882471.59537: stderr chunk (state=3): >>><<< 25052 1726882471.59541: stdout chunk (state=3): >>><<< 25052 1726882471.59560: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "start": "2024-09-20 21:34:31.552550", "end": "2024-09-20 21:34:31.576917", "delta": "0:00:00.024367", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882471.59589: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882471.59601: _low_level_execute_command(): starting 25052 1726882471.59604: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882471.2968113-25465-6883277363462/ > /dev/null 2>&1 && sleep 0' 25052 1726882471.60070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.60073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.60076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.60078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882471.60080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.60124: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.60135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.60206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.62024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.62051: stderr chunk (state=3): >>><<< 25052 1726882471.62057: stdout chunk (state=3): >>><<< 25052 1726882471.62069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.62075: handler run complete 25052 1726882471.62092: Evaluated conditional (False): False 25052 1726882471.62106: attempt loop complete, returning result 25052 1726882471.62109: _execute() done 25052 1726882471.62111: dumping result to json 25052 1726882471.62113: done dumping result, returning 25052 1726882471.62122: done running TaskExecutor() for managed_node2/TASK: Set up gateway ip on veth peer [12673a56-9f93-f7f6-4a6d-00000000000d] 25052 1726882471.62126: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000d 25052 1726882471.62225: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000d 25052 1726882471.62228: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "ip netns add ns1\nip link set peerveth0 netns ns1\nip netns exec ns1 ip -6 addr add 2001:db8::1/32 dev peerveth0\nip netns exec ns1 ip link set peerveth0 up\n", "delta": "0:00:00.024367", "end": "2024-09-20 21:34:31.576917", "rc": 0, "start": "2024-09-20 21:34:31.552550" } 25052 1726882471.62318: no more pending results, returning what we have 25052 1726882471.62322: results queue empty 25052 1726882471.62323: checking for any_errors_fatal 25052 1726882471.62328: done checking for any_errors_fatal 25052 1726882471.62329: checking for max_fail_percentage 25052 1726882471.62330: done checking for max_fail_percentage 25052 1726882471.62331: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.62332: done checking to see if all hosts have failed 25052 1726882471.62333: getting the remaining hosts for this loop 25052 1726882471.62334: done getting the remaining hosts for this loop 25052 1726882471.62339: getting the next task for host managed_node2 25052 1726882471.62344: done getting next task for host managed_node2 25052 1726882471.62347: ^ task is: TASK: TEST: I can configure an interface with static ipv6 config 25052 1726882471.62348: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.62353: getting variables 25052 1726882471.62354: in VariableManager get_vars() 25052 1726882471.62389: Calling all_inventory to load vars for managed_node2 25052 1726882471.62392: Calling groups_inventory to load vars for managed_node2 25052 1726882471.62396: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.62405: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.62407: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.62409: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.62576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.62690: done with get_vars() 25052 1726882471.62700: done getting variables 25052 1726882471.62741: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST: I can configure an interface with static ipv6 config] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:27 Friday 20 September 2024 21:34:31 -0400 (0:00:00.379) 0:00:08.582 ****** 25052 1726882471.62760: entering _queue_task() for managed_node2/debug 25052 1726882471.62957: worker is 1 (out of 1 available) 25052 1726882471.62969: exiting _queue_task() for managed_node2/debug 25052 1726882471.62980: done queuing things up, now waiting for results queue to drain 25052 1726882471.62981: waiting for pending results... 25052 1726882471.63134: running TaskExecutor() for managed_node2/TASK: TEST: I can configure an interface with static ipv6 config 25052 1726882471.63188: in run() - task 12673a56-9f93-f7f6-4a6d-00000000000f 25052 1726882471.63206: variable 'ansible_search_path' from source: unknown 25052 1726882471.63234: calling self._execute() 25052 1726882471.63300: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.63303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.63312: variable 'omit' from source: magic vars 25052 1726882471.63564: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.63573: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.63578: variable 'omit' from source: magic vars 25052 1726882471.63595: variable 'omit' from source: magic vars 25052 1726882471.63621: variable 'omit' from source: magic vars 25052 1726882471.63652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882471.63680: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882471.63699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882471.63713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.63723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.63745: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882471.63750: variable 'ansible_host' from 
source: host vars for 'managed_node2' 25052 1726882471.63752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.63824: Set connection var ansible_pipelining to False 25052 1726882471.63827: Set connection var ansible_connection to ssh 25052 1726882471.63829: Set connection var ansible_shell_type to sh 25052 1726882471.63834: Set connection var ansible_timeout to 10 25052 1726882471.63841: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882471.63846: Set connection var ansible_shell_executable to /bin/sh 25052 1726882471.63862: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.63865: variable 'ansible_connection' from source: unknown 25052 1726882471.63867: variable 'ansible_module_compression' from source: unknown 25052 1726882471.63870: variable 'ansible_shell_type' from source: unknown 25052 1726882471.63873: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.63876: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.63878: variable 'ansible_pipelining' from source: unknown 25052 1726882471.63880: variable 'ansible_timeout' from source: unknown 25052 1726882471.63882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.63979: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882471.63988: variable 'omit' from source: magic vars 25052 1726882471.64001: starting attempt loop 25052 1726882471.64004: running the handler 25052 1726882471.64036: handler run complete 25052 1726882471.64049: attempt loop complete, returning result 25052 1726882471.64052: _execute() done 25052 1726882471.64054: dumping result to json 25052 1726882471.64056: done dumping result, returning 25052 1726882471.64062: done running TaskExecutor() for managed_node2/TASK: TEST: I can configure an interface with static ipv6 config [12673a56-9f93-f7f6-4a6d-00000000000f] 25052 1726882471.64067: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000f 25052 1726882471.64147: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000000f 25052 1726882471.64150: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: ################################################## 25052 1726882471.64199: no more pending results, returning what we have 25052 1726882471.64203: results queue empty 25052 1726882471.64204: checking for any_errors_fatal 25052 1726882471.64210: done checking for any_errors_fatal 25052 1726882471.64211: checking for max_fail_percentage 25052 1726882471.64212: done checking for max_fail_percentage 25052 1726882471.64213: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.64213: done checking to see if all hosts have failed 25052 1726882471.64214: getting the remaining hosts for this loop 25052 1726882471.64215: done getting the remaining hosts for this loop 25052 1726882471.64218: getting the next task for host managed_node2 25052 1726882471.64225: done getting next task for host managed_node2 25052 1726882471.64231: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25052 1726882471.64233: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882471.64246: getting variables 25052 1726882471.64247: in VariableManager get_vars() 25052 1726882471.64285: Calling all_inventory to load vars for managed_node2 25052 1726882471.64288: Calling groups_inventory to load vars for managed_node2 25052 1726882471.64290: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.64299: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.64301: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.64303: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.64411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.64532: done with get_vars() 25052 1726882471.64540: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:31 -0400 (0:00:00.018) 0:00:08.600 ****** 25052 1726882471.64604: entering _queue_task() for managed_node2/include_tasks 25052 1726882471.64778: worker is 1 (out of 1 available) 25052 1726882471.64792: exiting _queue_task() for managed_node2/include_tasks 25052 1726882471.64803: done queuing things up, now waiting for results queue to drain 25052 1726882471.64804: waiting for pending results... 
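The include that follows loads the role's set_facts.yml, and a few records below its "Ensure ansible_facts used by role are present" setup task is skipped because every fact named in __network_required_facts is already present. A sketch of that gating pattern as it is typically written (the when: expression is copied from the logged false_condition; the gather_subset value is an assumption, since the skipped task never logs its module args):

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset:
      - min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0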
25052 1726882471.64944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25052 1726882471.65028: in run() - task 12673a56-9f93-f7f6-4a6d-000000000017 25052 1726882471.65040: variable 'ansible_search_path' from source: unknown 25052 1726882471.65043: variable 'ansible_search_path' from source: unknown 25052 1726882471.65067: calling self._execute() 25052 1726882471.65123: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.65126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.65134: variable 'omit' from source: magic vars 25052 1726882471.65375: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.65384: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.65390: _execute() done 25052 1726882471.65395: dumping result to json 25052 1726882471.65400: done dumping result, returning 25052 1726882471.65406: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-f7f6-4a6d-000000000017] 25052 1726882471.65410: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000017 25052 1726882471.65490: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000017 25052 1726882471.65501: WORKER PROCESS EXITING 25052 1726882471.65538: no more pending results, returning what we have 25052 1726882471.65542: in VariableManager get_vars() 25052 1726882471.65576: Calling all_inventory to load vars for managed_node2 25052 1726882471.65578: Calling groups_inventory to load vars for managed_node2 25052 1726882471.65580: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.65587: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.65590: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.65592: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.65730: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.65842: done with get_vars() 25052 1726882471.65847: variable 'ansible_search_path' from source: unknown 25052 1726882471.65848: variable 'ansible_search_path' from source: unknown 25052 1726882471.65871: we have included files to process 25052 1726882471.65872: generating all_blocks data 25052 1726882471.65873: done generating all_blocks data 25052 1726882471.65876: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882471.65877: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882471.65878: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882471.66322: done processing included file 25052 1726882471.66323: iterating over new_blocks loaded from include file 25052 1726882471.66324: in VariableManager get_vars() 25052 1726882471.66338: done with get_vars() 25052 1726882471.66339: filtering new block on tags 25052 1726882471.66349: done filtering new block on tags 25052 1726882471.66351: in VariableManager get_vars() 25052 1726882471.66362: done with get_vars() 25052 1726882471.66363: filtering new block on tags 25052 1726882471.66377: done filtering new block on tags 25052 1726882471.66378: in 
VariableManager get_vars() 25052 1726882471.66394: done with get_vars() 25052 1726882471.66395: filtering new block on tags 25052 1726882471.66406: done filtering new block on tags 25052 1726882471.66407: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 25052 1726882471.66410: extending task lists for all hosts with included blocks 25052 1726882471.66888: done extending task lists 25052 1726882471.66889: done processing included files 25052 1726882471.66889: results queue empty 25052 1726882471.66890: checking for any_errors_fatal 25052 1726882471.66891: done checking for any_errors_fatal 25052 1726882471.66892: checking for max_fail_percentage 25052 1726882471.66894: done checking for max_fail_percentage 25052 1726882471.66895: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.66896: done checking to see if all hosts have failed 25052 1726882471.66896: getting the remaining hosts for this loop 25052 1726882471.66897: done getting the remaining hosts for this loop 25052 1726882471.66899: getting the next task for host managed_node2 25052 1726882471.66901: done getting next task for host managed_node2 25052 1726882471.66903: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25052 1726882471.66905: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882471.66912: getting variables 25052 1726882471.66913: in VariableManager get_vars() 25052 1726882471.66923: Calling all_inventory to load vars for managed_node2 25052 1726882471.66925: Calling groups_inventory to load vars for managed_node2 25052 1726882471.66926: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.66929: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.66931: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.66932: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.67009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.67119: done with get_vars() 25052 1726882471.67125: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:31 -0400 (0:00:00.025) 0:00:08.626 ****** 25052 1726882471.67171: entering _queue_task() for managed_node2/setup 25052 1726882471.67341: worker is 1 (out of 1 available) 25052 1726882471.67354: exiting _queue_task() for managed_node2/setup 25052 1726882471.67364: done queuing things up, now waiting for results queue to drain 25052 1726882471.67365: waiting for pending results... 25052 1726882471.67507: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25052 1726882471.67589: in run() - task 12673a56-9f93-f7f6-4a6d-0000000001fc 25052 1726882471.67603: variable 'ansible_search_path' from source: unknown 25052 1726882471.67607: variable 'ansible_search_path' from source: unknown 25052 1726882471.67632: calling self._execute() 25052 1726882471.67685: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.67690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.67707: variable 'omit' from source: magic vars 25052 1726882471.67938: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.67947: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.68082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882471.69646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882471.69690: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882471.69720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882471.69744: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882471.69763: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882471.69822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882471.69842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25052 1726882471.69860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882471.69888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882471.69903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882471.69940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882471.69956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882471.69972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882471.70003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882471.70014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882471.70113: variable '__network_required_facts' from source: role '' defaults 25052 1726882471.70120: variable 'ansible_facts' from source: unknown 25052 1726882471.70169: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25052 1726882471.70173: when evaluation is False, skipping this task 25052 1726882471.70176: _execute() done 25052 1726882471.70178: dumping result to json 25052 1726882471.70180: done dumping result, returning 25052 1726882471.70186: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-f7f6-4a6d-0000000001fc] 25052 1726882471.70189: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001fc 25052 1726882471.70272: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001fc 25052 1726882471.70275: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882471.70354: no more pending results, returning what we have 25052 1726882471.70357: results queue empty 25052 1726882471.70358: checking for any_errors_fatal 25052 1726882471.70359: done checking for any_errors_fatal 25052 1726882471.70359: checking for max_fail_percentage 25052 1726882471.70361: done checking for max_fail_percentage 25052 1726882471.70362: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.70362: done checking to see if all hosts have failed 25052 1726882471.70363: getting the remaining hosts for this loop 25052 1726882471.70364: done getting the remaining hosts for 
this loop 25052 1726882471.70367: getting the next task for host managed_node2 25052 1726882471.70375: done getting next task for host managed_node2 25052 1726882471.70379: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25052 1726882471.70383: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882471.70395: getting variables 25052 1726882471.70397: in VariableManager get_vars() 25052 1726882471.70432: Calling all_inventory to load vars for managed_node2 25052 1726882471.70435: Calling groups_inventory to load vars for managed_node2 25052 1726882471.70437: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.70444: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.70446: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.70449: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.70559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.70834: done with get_vars() 25052 1726882471.70841: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:31 -0400 (0:00:00.037) 0:00:08.663 ****** 25052 1726882471.70907: entering _queue_task() for managed_node2/stat 25052 1726882471.71086: worker is 1 (out of 1 available) 25052 1726882471.71102: exiting _queue_task() for managed_node2/stat 25052 1726882471.71114: done queuing things up, now waiting for results queue to drain 25052 1726882471.71115: waiting for pending results... 
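The task just skipped above ("Ensure ansible_facts used by role are present", set_facts.yml:3) is a guarded fact-gathering step: the setup action is queued but then skipped because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, and the result is censored because no_log was set for it. A minimal sketch of what such a task could look like, reconstructed only from the task name, the setup action, and the conditional visible in the log; the gather_subset value is an assumption and the actual set_facts.yml may differ:

- name: Ensure ansible_facts used by role are present
  setup:
    # gather_subset value is an assumption; the log only shows that the
    # setup action was queued and then skipped by the conditional below
    gather_subset: min
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
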
25052 1726882471.71260: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25052 1726882471.71365: in run() - task 12673a56-9f93-f7f6-4a6d-0000000001fe 25052 1726882471.71375: variable 'ansible_search_path' from source: unknown 25052 1726882471.71379: variable 'ansible_search_path' from source: unknown 25052 1726882471.71408: calling self._execute() 25052 1726882471.71467: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.71470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.71479: variable 'omit' from source: magic vars 25052 1726882471.71733: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.71741: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.71856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882471.72043: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882471.72074: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882471.72097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882471.72129: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882471.72185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882471.72206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882471.72228: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882471.72245: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882471.72304: variable '__network_is_ostree' from source: set_fact 25052 1726882471.72310: Evaluated conditional (not __network_is_ostree is defined): False 25052 1726882471.72313: when evaluation is False, skipping this task 25052 1726882471.72315: _execute() done 25052 1726882471.72320: dumping result to json 25052 1726882471.72322: done dumping result, returning 25052 1726882471.72333: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-f7f6-4a6d-0000000001fe] 25052 1726882471.72335: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001fe 25052 1726882471.72409: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001fe 25052 1726882471.72412: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25052 1726882471.72474: no more pending results, returning what we have 25052 1726882471.72477: results queue empty 25052 1726882471.72478: checking for any_errors_fatal 25052 1726882471.72482: done checking for any_errors_fatal 25052 1726882471.72483: checking for 
max_fail_percentage 25052 1726882471.72484: done checking for max_fail_percentage 25052 1726882471.72485: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.72486: done checking to see if all hosts have failed 25052 1726882471.72487: getting the remaining hosts for this loop 25052 1726882471.72488: done getting the remaining hosts for this loop 25052 1726882471.72490: getting the next task for host managed_node2 25052 1726882471.72499: done getting next task for host managed_node2 25052 1726882471.72502: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25052 1726882471.72505: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882471.72516: getting variables 25052 1726882471.72517: in VariableManager get_vars() 25052 1726882471.72546: Calling all_inventory to load vars for managed_node2 25052 1726882471.72548: Calling groups_inventory to load vars for managed_node2 25052 1726882471.72550: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.72556: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.72558: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.72559: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.72663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.72787: done with get_vars() 25052 1726882471.72798: done getting variables 25052 1726882471.72834: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:31 -0400 (0:00:00.019) 0:00:08.683 ****** 25052 1726882471.72856: entering _queue_task() for managed_node2/set_fact 25052 1726882471.73027: worker is 1 (out of 1 available) 25052 1726882471.73039: exiting _queue_task() for managed_node2/set_fact 25052 1726882471.73050: done queuing things up, now waiting for results queue to drain 25052 1726882471.73051: waiting for pending results... 
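The two tasks handled around this point ("Check if system is ostree" at set_facts.yml:12 and "Set flag to indicate system is ostree" at set_facts.yml:17) are both skipped because __network_is_ostree already comes from an earlier set_fact, so their shared conditional not __network_is_ostree is defined is False. A hedged sketch of how such a stat-plus-set_fact pair is typically written; the marker path and the register name are assumptions, since the skipped-task output does not show them:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed ostree marker path, not shown in the log
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
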
25052 1726882471.73185: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25052 1726882471.73267: in run() - task 12673a56-9f93-f7f6-4a6d-0000000001ff 25052 1726882471.73281: variable 'ansible_search_path' from source: unknown 25052 1726882471.73284: variable 'ansible_search_path' from source: unknown 25052 1726882471.73308: calling self._execute() 25052 1726882471.73363: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.73367: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.73375: variable 'omit' from source: magic vars 25052 1726882471.73700: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.73705: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.73727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882471.73959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882471.73988: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882471.74014: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882471.74041: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882471.74099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882471.74115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882471.74132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882471.74153: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882471.74212: variable '__network_is_ostree' from source: set_fact 25052 1726882471.74217: Evaluated conditional (not __network_is_ostree is defined): False 25052 1726882471.74220: when evaluation is False, skipping this task 25052 1726882471.74223: _execute() done 25052 1726882471.74225: dumping result to json 25052 1726882471.74227: done dumping result, returning 25052 1726882471.74235: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-f7f6-4a6d-0000000001ff] 25052 1726882471.74239: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001ff 25052 1726882471.74323: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000001ff 25052 1726882471.74326: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25052 1726882471.74397: no more pending results, returning what we have 25052 1726882471.74400: results queue empty 25052 1726882471.74400: checking for any_errors_fatal 25052 1726882471.74405: done checking for any_errors_fatal 25052 
1726882471.74406: checking for max_fail_percentage 25052 1726882471.74407: done checking for max_fail_percentage 25052 1726882471.74408: checking to see if all hosts have failed and the running result is not ok 25052 1726882471.74409: done checking to see if all hosts have failed 25052 1726882471.74410: getting the remaining hosts for this loop 25052 1726882471.74410: done getting the remaining hosts for this loop 25052 1726882471.74413: getting the next task for host managed_node2 25052 1726882471.74421: done getting next task for host managed_node2 25052 1726882471.74424: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25052 1726882471.74427: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882471.74439: getting variables 25052 1726882471.74440: in VariableManager get_vars() 25052 1726882471.74464: Calling all_inventory to load vars for managed_node2 25052 1726882471.74466: Calling groups_inventory to load vars for managed_node2 25052 1726882471.74468: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882471.74473: Calling all_plugins_play to load vars for managed_node2 25052 1726882471.74475: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882471.74476: Calling groups_plugins_play to load vars for managed_node2 25052 1726882471.74612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882471.74731: done with get_vars() 25052 1726882471.74737: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:31 -0400 (0:00:00.019) 0:00:08.702 ****** 25052 1726882471.74801: entering _queue_task() for managed_node2/service_facts 25052 1726882471.74802: Creating lock for service_facts 25052 1726882471.74977: worker is 1 (out of 1 available) 25052 1726882471.74989: exiting _queue_task() for managed_node2/service_facts 25052 1726882471.75003: done queuing things up, now waiting for results queue to drain 25052 1726882471.75005: waiting for pending results... 
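The next task, "Check which services are running" (set_facts.yml:21), is the first in this block that actually executes: the service_facts action is queued, the module payload is built by ANSIBALLZ, transferred over the existing SSH connection, and run remotely, and its stdout (the large ansible_facts.services JSON further down) maps each systemd unit to name, state, status and source. A minimal sketch of the task plus a purely illustrative consumer of the result; the debug task below is an example added here, not something taken from the role:

- name: Check which services are running
  service_facts:

- name: Example only - report whether NetworkManager is running (illustrative)
  debug:
    msg: "NetworkManager running: {{ ansible_facts.services['NetworkManager.service'].state == 'running' }}"
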
25052 1726882471.75144: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 25052 1726882471.75220: in run() - task 12673a56-9f93-f7f6-4a6d-000000000201 25052 1726882471.75234: variable 'ansible_search_path' from source: unknown 25052 1726882471.75238: variable 'ansible_search_path' from source: unknown 25052 1726882471.75260: calling self._execute() 25052 1726882471.75318: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.75322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.75330: variable 'omit' from source: magic vars 25052 1726882471.75569: variable 'ansible_distribution_major_version' from source: facts 25052 1726882471.75578: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882471.75584: variable 'omit' from source: magic vars 25052 1726882471.75633: variable 'omit' from source: magic vars 25052 1726882471.75656: variable 'omit' from source: magic vars 25052 1726882471.75687: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882471.75715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882471.75730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882471.75743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.75752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882471.75781: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882471.75784: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.75786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.75848: Set connection var ansible_pipelining to False 25052 1726882471.75852: Set connection var ansible_connection to ssh 25052 1726882471.75854: Set connection var ansible_shell_type to sh 25052 1726882471.75859: Set connection var ansible_timeout to 10 25052 1726882471.75865: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882471.75870: Set connection var ansible_shell_executable to /bin/sh 25052 1726882471.75889: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.75897: variable 'ansible_connection' from source: unknown 25052 1726882471.75901: variable 'ansible_module_compression' from source: unknown 25052 1726882471.75903: variable 'ansible_shell_type' from source: unknown 25052 1726882471.75906: variable 'ansible_shell_executable' from source: unknown 25052 1726882471.75908: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882471.75910: variable 'ansible_pipelining' from source: unknown 25052 1726882471.75912: variable 'ansible_timeout' from source: unknown 25052 1726882471.75914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882471.76045: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882471.76052: variable 'omit' from source: magic vars 25052 
1726882471.76057: starting attempt loop 25052 1726882471.76060: running the handler 25052 1726882471.76070: _low_level_execute_command(): starting 25052 1726882471.76078: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882471.76582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882471.76587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882471.76591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.76643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882471.76646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.76650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.76724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.78437: stdout chunk (state=3): >>>/root <<< 25052 1726882471.78535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.78561: stderr chunk (state=3): >>><<< 25052 1726882471.78564: stdout chunk (state=3): >>><<< 25052 1726882471.78584: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.78596: _low_level_execute_command(): starting 25052 1726882471.78604: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720 `" 
&& echo ansible-tmp-1726882471.785835-25486-16548198611720="` echo /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720 `" ) && sleep 0' 25052 1726882471.79080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882471.79112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.79148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.79352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.81272: stdout chunk (state=3): >>>ansible-tmp-1726882471.785835-25486-16548198611720=/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720 <<< 25052 1726882471.81419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.81439: stderr chunk (state=3): >>><<< 25052 1726882471.81450: stdout chunk (state=3): >>><<< 25052 1726882471.81503: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882471.785835-25486-16548198611720=/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882471.81551: variable 'ansible_module_compression' from source: unknown 25052 1726882471.81614: ANSIBALLZ: Using lock for service_facts 25052 1726882471.81700: ANSIBALLZ: Acquiring lock 25052 1726882471.81703: ANSIBALLZ: Lock acquired: 140207134765904 25052 1726882471.81706: ANSIBALLZ: 
Creating module 25052 1726882471.94266: ANSIBALLZ: Writing module into payload 25052 1726882471.94358: ANSIBALLZ: Writing module 25052 1726882471.94500: ANSIBALLZ: Renaming module 25052 1726882471.94503: ANSIBALLZ: Done creating module 25052 1726882471.94507: variable 'ansible_facts' from source: unknown 25052 1726882471.94509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py 25052 1726882471.94616: Sending initial data 25052 1726882471.94620: Sent initial data (160 bytes) 25052 1726882471.95369: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882471.95515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882471.95598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882471.97287: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882471.97337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882471.97414: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp0_k2y4eh /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py <<< 25052 1726882471.97417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py" <<< 25052 1726882471.97499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp0_k2y4eh" to remote "/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py" <<< 25052 1726882471.98869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882471.98897: stderr chunk (state=3): >>><<< 25052 1726882471.99011: stdout chunk (state=3): >>><<< 25052 1726882471.99037: done transferring module to remote 25052 1726882471.99047: _low_level_execute_command(): starting 25052 1726882471.99052: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/ /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py && sleep 0' 25052 1726882472.00259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882472.00263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882472.00266: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882472.00286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882472.00313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882472.00430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882472.00495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882472.00499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882472.00521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882472.00604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882472.02438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882472.02507: stderr chunk (state=3): >>><<< 25052 1726882472.02510: stdout chunk (state=3): >>><<< 25052 1726882472.02526: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882472.02529: _low_level_execute_command(): starting 25052 1726882472.02538: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/AnsiballZ_service_facts.py && sleep 0' 25052 1726882472.03817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882472.03834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882472.03922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882472.03983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882472.04032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882472.04158: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882472.04688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882473.56477: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 25052 1726882473.56506: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": 
"kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": 
{"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 25052 1726882473.56546: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 25052 1726882473.56566: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": 
"systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": 
{"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25052 1726882473.58072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882473.58084: stdout chunk (state=3): >>><<< 25052 1726882473.58100: stderr chunk (state=3): >>><<< 25052 1726882473.58133: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": 
"systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882473.59119: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882473.59138: _low_level_execute_command(): starting 25052 1726882473.59288: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882471.785835-25486-16548198611720/ > /dev/null 2>&1 && sleep 0' 25052 1726882473.60507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882473.60624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882473.60648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882473.60673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882473.60765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882473.62579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882473.62583: stdout chunk (state=3): >>><<< 25052 1726882473.62585: stderr chunk (state=3): >>><<< 25052 1726882473.62604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882473.62619: handler run complete 25052 1726882473.63402: variable 'ansible_facts' from source: unknown 25052 1726882473.64000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882473.64878: variable 'ansible_facts' from source: unknown 25052 1726882473.65400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882473.65598: attempt loop complete, returning result 25052 1726882473.65705: _execute() done 25052 1726882473.65712: dumping result to json 25052 1726882473.65781: done dumping result, returning 25052 1726882473.66050: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-f7f6-4a6d-000000000201] 25052 1726882473.66053: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000201 25052 1726882473.67502: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000201 25052 1726882473.67505: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882473.67554: no more pending results, returning what we have 25052 1726882473.67557: results queue empty 25052 1726882473.67558: checking for any_errors_fatal 25052 1726882473.67562: done checking for any_errors_fatal 25052 1726882473.67563: checking for max_fail_percentage 25052 1726882473.67564: done checking for max_fail_percentage 25052 1726882473.67565: checking to see if all hosts have failed and the running result is not ok 25052 1726882473.67566: done checking to see if all hosts have failed 25052 1726882473.67567: getting the remaining hosts for this loop 25052 1726882473.67568: done getting the remaining hosts for this loop 25052 1726882473.67571: getting the next task for host managed_node2 25052 1726882473.67578: done getting next task for host managed_node2 25052 1726882473.67581: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25052 1726882473.67584: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882473.67598: getting variables 25052 1726882473.67600: in VariableManager get_vars() 25052 1726882473.67635: Calling all_inventory to load vars for managed_node2 25052 1726882473.67638: Calling groups_inventory to load vars for managed_node2 25052 1726882473.67640: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882473.67649: Calling all_plugins_play to load vars for managed_node2 25052 1726882473.67652: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882473.67654: Calling groups_plugins_play to load vars for managed_node2 25052 1726882473.68578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882473.69702: done with get_vars() 25052 1726882473.69717: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:33 -0400 (0:00:01.952) 0:00:10.654 ****** 25052 1726882473.70015: entering _queue_task() for managed_node2/package_facts 25052 1726882473.70017: Creating lock for package_facts 25052 1726882473.70715: worker is 1 (out of 1 available) 25052 1726882473.70722: exiting _queue_task() for managed_node2/package_facts 25052 1726882473.70732: done queuing things up, now waiting for results queue to drain 25052 1726882473.70734: waiting for pending results... 
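[editor's note] The service_facts output captured above (the ansible_facts.services dictionary, keyed by unit name with name/state/status/source per entry) is typically consumed by later tasks. A minimal sketch follows, assuming a hypothetical follow-up task in the same play; the task name and message are illustrative and are not part of this run.

# Minimal sketch (assumption: runs after the service_facts task above).
# It only reads the facts structure shown in the captured stdout.
- name: Report firewalld state from the gathered service facts
  ansible.builtin.debug:
    msg: "firewalld is {{ ansible_facts.services['firewalld.service'].state }}"
  when: "'firewalld.service' in ansible_facts.services"

[editor's note] In this run the result itself is censored in the task output ("no_log: true"), but the facts are still registered on the host and remain usable by conditions such as the one sketched above.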
25052 1726882473.71311: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25052 1726882473.71316: in run() - task 12673a56-9f93-f7f6-4a6d-000000000202 25052 1726882473.71320: variable 'ansible_search_path' from source: unknown 25052 1726882473.71699: variable 'ansible_search_path' from source: unknown 25052 1726882473.71703: calling self._execute() 25052 1726882473.71705: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882473.71708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882473.71710: variable 'omit' from source: magic vars 25052 1726882473.72381: variable 'ansible_distribution_major_version' from source: facts 25052 1726882473.72403: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882473.72487: variable 'omit' from source: magic vars 25052 1726882473.72568: variable 'omit' from source: magic vars 25052 1726882473.72613: variable 'omit' from source: magic vars 25052 1726882473.72657: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882473.72703: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882473.72730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882473.72752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882473.72775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882473.72813: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882473.72824: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882473.72836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882473.72949: Set connection var ansible_pipelining to False 25052 1726882473.72957: Set connection var ansible_connection to ssh 25052 1726882473.72964: Set connection var ansible_shell_type to sh 25052 1726882473.72976: Set connection var ansible_timeout to 10 25052 1726882473.72988: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882473.73000: Set connection var ansible_shell_executable to /bin/sh 25052 1726882473.73026: variable 'ansible_shell_executable' from source: unknown 25052 1726882473.73035: variable 'ansible_connection' from source: unknown 25052 1726882473.73047: variable 'ansible_module_compression' from source: unknown 25052 1726882473.73059: variable 'ansible_shell_type' from source: unknown 25052 1726882473.73066: variable 'ansible_shell_executable' from source: unknown 25052 1726882473.73074: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882473.73081: variable 'ansible_pipelining' from source: unknown 25052 1726882473.73088: variable 'ansible_timeout' from source: unknown 25052 1726882473.73098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882473.73308: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882473.73323: variable 'omit' from source: magic vars 25052 
1726882473.73333: starting attempt loop 25052 1726882473.73340: running the handler 25052 1726882473.73357: _low_level_execute_command(): starting 25052 1726882473.73397: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882473.74125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882473.74177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882473.74191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882473.74252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882473.74298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882473.74325: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882473.74372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882473.74586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882473.75999: stdout chunk (state=3): >>>/root <<< 25052 1726882473.76132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882473.76143: stdout chunk (state=3): >>><<< 25052 1726882473.76197: stderr chunk (state=3): >>><<< 25052 1726882473.76216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882473.76234: _low_level_execute_command(): starting 25052 1726882473.76364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105 `" && echo ansible-tmp-1726882473.762227-25577-239592212718105="` echo /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105 `" ) && sleep 0' 25052 1726882473.77516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882473.77520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882473.77522: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882473.77532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882473.77534: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882473.77537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882473.77574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882473.77715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882473.77792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882473.79652: stdout chunk (state=3): >>>ansible-tmp-1726882473.762227-25577-239592212718105=/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105 <<< 25052 1726882473.79813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882473.79816: stdout chunk (state=3): >>><<< 25052 1726882473.79818: stderr chunk (state=3): >>><<< 25052 1726882473.79832: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882473.762227-25577-239592212718105=/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 25052 1726882473.79903: variable 'ansible_module_compression' from source: unknown 25052 1726882473.80105: ANSIBALLZ: Using lock for package_facts 25052 1726882473.80108: ANSIBALLZ: Acquiring lock 25052 1726882473.80111: ANSIBALLZ: Lock acquired: 140207138671744 25052 1726882473.80113: ANSIBALLZ: Creating module 25052 1726882474.17764: ANSIBALLZ: Writing module into payload 25052 1726882474.17920: ANSIBALLZ: Writing module 25052 1726882474.17957: ANSIBALLZ: Renaming module 25052 1726882474.17970: ANSIBALLZ: Done creating module 25052 1726882474.18013: variable 'ansible_facts' from source: unknown 25052 1726882474.18224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py 25052 1726882474.18386: Sending initial data 25052 1726882474.18397: Sent initial data (161 bytes) 25052 1726882474.19064: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882474.19109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882474.19129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882474.19211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882474.19265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882474.19330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882474.20911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882474.20996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882474.21058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpegqwdvnc /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py <<< 25052 1726882474.21062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py" <<< 25052 1726882474.21117: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpegqwdvnc" to remote "/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py" <<< 25052 1726882474.23033: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882474.23037: stdout chunk (state=3): >>><<< 25052 1726882474.23040: stderr chunk (state=3): >>><<< 25052 1726882474.23042: done transferring module to remote 25052 1726882474.23044: _low_level_execute_command(): starting 25052 1726882474.23046: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/ /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py && sleep 0' 25052 1726882474.24039: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882474.24045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882474.24050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882474.24053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882474.24059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882474.24106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882474.24109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882474.24291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882474.24422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882474.26165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882474.26197: stderr chunk (state=3): >>><<< 25052 1726882474.26206: stdout chunk (state=3): >>><<< 25052 1726882474.26298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882474.26302: _low_level_execute_command(): starting 25052 1726882474.26304: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/AnsiballZ_package_facts.py && sleep 0' 25052 1726882474.27461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882474.27485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882474.27498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882474.27711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882474.27809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882474.71669: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 25052 1726882474.71677: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 25052 1726882474.71720: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", 
"version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 25052 1726882474.71732: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", 
"version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 25052 1726882474.71758: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 25052 1726882474.71771: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": 
"initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 25052 1726882474.71798: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": 
"1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 25052 1726882474.71807: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 25052 1726882474.71837: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 25052 1726882474.71843: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": 
"1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 25052 1726882474.71869: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 25052 1726882474.71888: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-m<<< 25052 1726882474.71900: stdout chunk (state=3): >>>apper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25052 1726882474.73636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882474.73639: stdout chunk (state=3): >>><<< 25052 1726882474.73641: stderr chunk (state=3): >>><<< 25052 1726882474.73910: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882474.76449: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882474.76479: _low_level_execute_command(): starting 25052 1726882474.76490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882473.762227-25577-239592212718105/ > /dev/null 2>&1 && sleep 0' 25052 1726882474.77187: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882474.77207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882474.77226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882474.77246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882474.77264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882474.77361: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882474.77376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882474.77468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882474.79305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882474.79335: stderr chunk (state=3): >>><<< 25052 1726882474.79338: stdout chunk (state=3): >>><<< 25052 1726882474.79355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882474.79361: handler run complete 25052 1726882474.79816: variable 'ansible_facts' from source: unknown 25052 1726882474.80090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.81417: variable 'ansible_facts' from source: unknown 25052 1726882474.81687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.82199: attempt loop complete, returning result 25052 1726882474.82208: _execute() done 25052 1726882474.82210: dumping result to json 25052 1726882474.82324: done dumping result, returning 25052 1726882474.82333: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-f7f6-4a6d-000000000202] 25052 1726882474.82338: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000202 25052 1726882474.83685: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000202 25052 1726882474.83688: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882474.83732: no more pending results, returning what we have 25052 1726882474.83735: results queue empty 25052 1726882474.83735: checking for any_errors_fatal 25052 1726882474.83738: done checking for any_errors_fatal 25052 1726882474.83739: checking for max_fail_percentage 25052 1726882474.83740: done checking for max_fail_percentage 25052 1726882474.83740: checking to see if all hosts have failed and the running result is not ok 25052 1726882474.83741: done checking to see if all hosts have failed 25052 1726882474.83741: getting the remaining hosts for this loop 25052 1726882474.83742: done getting the remaining hosts for this loop 25052 1726882474.83744: getting the next task for host managed_node2 25052 1726882474.83748: done getting next task for host managed_node2 25052 1726882474.83751: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25052 1726882474.83753: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882474.83759: getting variables 25052 1726882474.83760: in VariableManager get_vars() 25052 1726882474.83782: Calling all_inventory to load vars for managed_node2 25052 1726882474.83784: Calling groups_inventory to load vars for managed_node2 25052 1726882474.83785: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882474.83795: Calling all_plugins_play to load vars for managed_node2 25052 1726882474.83797: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882474.83799: Calling groups_plugins_play to load vars for managed_node2 25052 1726882474.84996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.86020: done with get_vars() 25052 1726882474.86040: done getting variables 25052 1726882474.86086: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:34 -0400 (0:00:01.160) 0:00:11.815 ****** 25052 1726882474.86118: entering _queue_task() for managed_node2/debug 25052 1726882474.86356: worker is 1 (out of 1 available) 25052 1726882474.86371: exiting _queue_task() for managed_node2/debug 25052 1726882474.86383: done queuing things up, now waiting for results queue to drain 25052 1726882474.86384: waiting for pending results... 25052 1726882474.86559: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 25052 1726882474.86641: in run() - task 12673a56-9f93-f7f6-4a6d-000000000018 25052 1726882474.86653: variable 'ansible_search_path' from source: unknown 25052 1726882474.86656: variable 'ansible_search_path' from source: unknown 25052 1726882474.86683: calling self._execute() 25052 1726882474.86759: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.86765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.86775: variable 'omit' from source: magic vars 25052 1726882474.87300: variable 'ansible_distribution_major_version' from source: facts 25052 1726882474.87304: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882474.87306: variable 'omit' from source: magic vars 25052 1726882474.87308: variable 'omit' from source: magic vars 25052 1726882474.87329: variable 'network_provider' from source: set_fact 25052 1726882474.87350: variable 'omit' from source: magic vars 25052 1726882474.87401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882474.87444: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882474.87467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882474.87495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882474.87517: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 
1726882474.87557: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882474.87567: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.87580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.87673: Set connection var ansible_pipelining to False 25052 1726882474.87680: Set connection var ansible_connection to ssh 25052 1726882474.87684: Set connection var ansible_shell_type to sh 25052 1726882474.87686: Set connection var ansible_timeout to 10 25052 1726882474.87711: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882474.87714: Set connection var ansible_shell_executable to /bin/sh 25052 1726882474.87725: variable 'ansible_shell_executable' from source: unknown 25052 1726882474.87728: variable 'ansible_connection' from source: unknown 25052 1726882474.87738: variable 'ansible_module_compression' from source: unknown 25052 1726882474.87744: variable 'ansible_shell_type' from source: unknown 25052 1726882474.87773: variable 'ansible_shell_executable' from source: unknown 25052 1726882474.87777: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.87779: variable 'ansible_pipelining' from source: unknown 25052 1726882474.87781: variable 'ansible_timeout' from source: unknown 25052 1726882474.87783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.87887: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882474.88068: variable 'omit' from source: magic vars 25052 1726882474.88072: starting attempt loop 25052 1726882474.88074: running the handler 25052 1726882474.88077: handler run complete 25052 1726882474.88079: attempt loop complete, returning result 25052 1726882474.88081: _execute() done 25052 1726882474.88083: dumping result to json 25052 1726882474.88085: done dumping result, returning 25052 1726882474.88087: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-f7f6-4a6d-000000000018] 25052 1726882474.88088: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000018 25052 1726882474.88362: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000018 25052 1726882474.88366: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 25052 1726882474.88421: no more pending results, returning what we have 25052 1726882474.88424: results queue empty 25052 1726882474.88425: checking for any_errors_fatal 25052 1726882474.88431: done checking for any_errors_fatal 25052 1726882474.88432: checking for max_fail_percentage 25052 1726882474.88433: done checking for max_fail_percentage 25052 1726882474.88434: checking to see if all hosts have failed and the running result is not ok 25052 1726882474.88435: done checking to see if all hosts have failed 25052 1726882474.88436: getting the remaining hosts for this loop 25052 1726882474.88437: done getting the remaining hosts for this loop 25052 1726882474.88440: getting the next task for host managed_node2 25052 1726882474.88445: done getting next task for host managed_node2 25052 1726882474.88448: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 25052 1726882474.88451: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882474.88460: getting variables 25052 1726882474.88462: in VariableManager get_vars() 25052 1726882474.88500: Calling all_inventory to load vars for managed_node2 25052 1726882474.88504: Calling groups_inventory to load vars for managed_node2 25052 1726882474.88506: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882474.88515: Calling all_plugins_play to load vars for managed_node2 25052 1726882474.88517: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882474.88520: Calling groups_plugins_play to load vars for managed_node2 25052 1726882474.89441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.90402: done with get_vars() 25052 1726882474.90418: done getting variables 25052 1726882474.90460: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:34 -0400 (0:00:00.043) 0:00:11.859 ****** 25052 1726882474.90485: entering _queue_task() for managed_node2/fail 25052 1726882474.90735: worker is 1 (out of 1 available) 25052 1726882474.90748: exiting _queue_task() for managed_node2/fail 25052 1726882474.90759: done queuing things up, now waiting for results queue to drain 25052 1726882474.90760: waiting for pending results... 
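The package_facts result captured above registers the full RPM inventory under ansible_facts.packages, keyed by package name, with each value a list of {name, version, release, epoch, arch, source} entries; the task's own output is censored in the result because no_log: true was in effect. A minimal sketch of how a follow-up play could consume that structure (the play and the NetworkManager lookup are illustrative, not part of this run):

    - hosts: managed_node2
      tasks:
        # Re-gather the package inventory; manager "auto" picks the rpm backend on EL hosts
        - name: Check which packages are installed
          ansible.builtin.package_facts:
            manager: auto

        # ansible_facts.packages['NetworkManager'] is a list, one entry per installed version
        - name: Report the installed NetworkManager version
          ansible.builtin.debug:
            msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
          when: "'NetworkManager' in ansible_facts.packages"

In this trace the same data shows NetworkManager 1.48.10-1.el10 installed on the managed node.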
25052 1726882474.91210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25052 1726882474.91215: in run() - task 12673a56-9f93-f7f6-4a6d-000000000019 25052 1726882474.91217: variable 'ansible_search_path' from source: unknown 25052 1726882474.91220: variable 'ansible_search_path' from source: unknown 25052 1726882474.91259: calling self._execute() 25052 1726882474.91347: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.91359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.91372: variable 'omit' from source: magic vars 25052 1726882474.91759: variable 'ansible_distribution_major_version' from source: facts 25052 1726882474.91782: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882474.91860: variable 'network_state' from source: role '' defaults 25052 1726882474.91869: Evaluated conditional (network_state != {}): False 25052 1726882474.91872: when evaluation is False, skipping this task 25052 1726882474.91875: _execute() done 25052 1726882474.91883: dumping result to json 25052 1726882474.91896: done dumping result, returning 25052 1726882474.91900: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-f7f6-4a6d-000000000019] 25052 1726882474.91903: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000019 25052 1726882474.91982: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000019 25052 1726882474.91985: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882474.92047: no more pending results, returning what we have 25052 1726882474.92050: results queue empty 25052 1726882474.92051: checking for any_errors_fatal 25052 1726882474.92057: done checking for any_errors_fatal 25052 1726882474.92058: checking for max_fail_percentage 25052 1726882474.92059: done checking for max_fail_percentage 25052 1726882474.92060: checking to see if all hosts have failed and the running result is not ok 25052 1726882474.92061: done checking to see if all hosts have failed 25052 1726882474.92061: getting the remaining hosts for this loop 25052 1726882474.92063: done getting the remaining hosts for this loop 25052 1726882474.92066: getting the next task for host managed_node2 25052 1726882474.92073: done getting next task for host managed_node2 25052 1726882474.92076: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25052 1726882474.92079: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882474.92097: getting variables 25052 1726882474.92098: in VariableManager get_vars() 25052 1726882474.92132: Calling all_inventory to load vars for managed_node2 25052 1726882474.92135: Calling groups_inventory to load vars for managed_node2 25052 1726882474.92137: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882474.92144: Calling all_plugins_play to load vars for managed_node2 25052 1726882474.92147: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882474.92149: Calling groups_plugins_play to load vars for managed_node2 25052 1726882474.92919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.93773: done with get_vars() 25052 1726882474.93787: done getting variables 25052 1726882474.93829: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:34 -0400 (0:00:00.033) 0:00:11.893 ****** 25052 1726882474.93853: entering _queue_task() for managed_node2/fail 25052 1726882474.94059: worker is 1 (out of 1 available) 25052 1726882474.94072: exiting _queue_task() for managed_node2/fail 25052 1726882474.94084: done queuing things up, now waiting for results queue to drain 25052 1726882474.94085: waiting for pending results... 
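Both of these "Abort applying the network state configuration ..." guards skip for the same reason: the trace records false_condition "network_state != {}", meaning no declarative network_state was supplied, so the fail task never fires. A representative sketch of that guard pattern, assuming a hypothetical failure message and assuming the provider check is expressed as a second when clause (the role's actual wording is not shown in this trace):

    # Guard task: refuse to continue when network_state is used with an unsupported provider
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider
      when:
        - network_state != {}                   # skipped here: network_state is left at its role default of {}
        - network_provider == "initscripts"     # this run resolved the provider to nm

With network_state at its default of {}, the first clause evaluates to False and Ansible reports the task as skipped rather than ok or failed.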
25052 1726882474.94248: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25052 1726882474.94333: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001a 25052 1726882474.94344: variable 'ansible_search_path' from source: unknown 25052 1726882474.94347: variable 'ansible_search_path' from source: unknown 25052 1726882474.94373: calling self._execute() 25052 1726882474.94441: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.94445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.94455: variable 'omit' from source: magic vars 25052 1726882474.94710: variable 'ansible_distribution_major_version' from source: facts 25052 1726882474.94720: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882474.94802: variable 'network_state' from source: role '' defaults 25052 1726882474.94810: Evaluated conditional (network_state != {}): False 25052 1726882474.94813: when evaluation is False, skipping this task 25052 1726882474.94816: _execute() done 25052 1726882474.94819: dumping result to json 25052 1726882474.94822: done dumping result, returning 25052 1726882474.94829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-f7f6-4a6d-00000000001a] 25052 1726882474.94833: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001a 25052 1726882474.94921: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001a 25052 1726882474.94924: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882474.95000: no more pending results, returning what we have 25052 1726882474.95009: results queue empty 25052 1726882474.95010: checking for any_errors_fatal 25052 1726882474.95016: done checking for any_errors_fatal 25052 1726882474.95017: checking for max_fail_percentage 25052 1726882474.95018: done checking for max_fail_percentage 25052 1726882474.95019: checking to see if all hosts have failed and the running result is not ok 25052 1726882474.95020: done checking to see if all hosts have failed 25052 1726882474.95020: getting the remaining hosts for this loop 25052 1726882474.95021: done getting the remaining hosts for this loop 25052 1726882474.95024: getting the next task for host managed_node2 25052 1726882474.95029: done getting next task for host managed_node2 25052 1726882474.95033: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25052 1726882474.95035: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882474.95048: getting variables 25052 1726882474.95049: in VariableManager get_vars() 25052 1726882474.95076: Calling all_inventory to load vars for managed_node2 25052 1726882474.95077: Calling groups_inventory to load vars for managed_node2 25052 1726882474.95079: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882474.95084: Calling all_plugins_play to load vars for managed_node2 25052 1726882474.95086: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882474.95088: Calling groups_plugins_play to load vars for managed_node2 25052 1726882474.95905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882474.96750: done with get_vars() 25052 1726882474.96764: done getting variables 25052 1726882474.96807: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:34 -0400 (0:00:00.029) 0:00:11.923 ****** 25052 1726882474.96829: entering _queue_task() for managed_node2/fail 25052 1726882474.97058: worker is 1 (out of 1 available) 25052 1726882474.97071: exiting _queue_task() for managed_node2/fail 25052 1726882474.97084: done queuing things up, now waiting for results queue to drain 25052 1726882474.97085: waiting for pending results... 
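The teaming guard queued here is evaluated a few lines further on: the trace shows three conditionals, ansible_distribution_major_version | int > 9 (True), ansible_distribution in __network_rh_distros (True), and finally a Jinja2 selectattr chain over network_connections and network_state that looks for any interface of type "team" (False, so the task skips). A sketch reconstructing that guard from the evaluated conditionals; the failure message is illustrative and the task is not quoted verbatim from the role source:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on this distribution version
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        # True only if any connection or network_state interface declares type "team"
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0

The selectattr("type", "defined") step drops entries without a type key first, so the subsequent regex match cannot trip over an undefined attribute.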
25052 1726882474.97512: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25052 1726882474.97517: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001b 25052 1726882474.97521: variable 'ansible_search_path' from source: unknown 25052 1726882474.97524: variable 'ansible_search_path' from source: unknown 25052 1726882474.97547: calling self._execute() 25052 1726882474.97641: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882474.97656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882474.97673: variable 'omit' from source: magic vars 25052 1726882474.98050: variable 'ansible_distribution_major_version' from source: facts 25052 1726882474.98069: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882474.98255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.00499: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.00586: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.00633: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.00672: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.00712: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.00801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.00909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.00913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.00919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.00942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.01053: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.01076: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25052 1726882475.01201: variable 'ansible_distribution' from source: facts 25052 1726882475.01212: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.01232: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25052 1726882475.01496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.01526: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.01566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.01610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.01622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.01658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.01675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.01698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.01721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.01731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.01762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.01779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.01798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.01822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.01832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.02020: variable 'network_connections' from source: task vars 25052 1726882475.02029: variable 'interface' from source: play vars 25052 1726882475.02079: variable 'interface' from source: play vars 25052 1726882475.02089: variable 'network_state' from source: role '' defaults 25052 1726882475.02136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.02245: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.02279: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.02306: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.02328: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.02358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.02373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.02398: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.02417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.02444: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25052 1726882475.02447: when evaluation is False, skipping this task 25052 1726882475.02450: _execute() done 25052 1726882475.02452: dumping result to json 25052 1726882475.02454: done dumping result, returning 25052 1726882475.02462: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-f7f6-4a6d-00000000001b] 25052 1726882475.02465: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001b 25052 1726882475.02547: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001b 25052 1726882475.02550: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25052 1726882475.02602: no more pending results, returning what we have 25052 1726882475.02605: results queue empty 25052 1726882475.02605: checking for any_errors_fatal 25052 1726882475.02611: done checking for any_errors_fatal 25052 1726882475.02612: checking for max_fail_percentage 25052 1726882475.02614: done checking for max_fail_percentage 25052 1726882475.02614: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.02615: done checking to see if all hosts have failed 25052 1726882475.02616: getting the remaining hosts for this loop 25052 1726882475.02617: done getting the remaining hosts for this loop 25052 1726882475.02620: getting the next task for host managed_node2 25052 1726882475.02627: done getting next task for host managed_node2 25052 1726882475.02630: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25052 1726882475.02633: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882475.02644: getting variables 25052 1726882475.02645: in VariableManager get_vars() 25052 1726882475.02690: Calling all_inventory to load vars for managed_node2 25052 1726882475.02696: Calling groups_inventory to load vars for managed_node2 25052 1726882475.02699: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.02708: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.02710: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.02712: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.03490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.04353: done with get_vars() 25052 1726882475.04369: done getting variables 25052 1726882475.04440: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:35 -0400 (0:00:00.076) 0:00:11.999 ****** 25052 1726882475.04461: entering _queue_task() for managed_node2/dnf 25052 1726882475.04676: worker is 1 (out of 1 available) 25052 1726882475.04689: exiting _queue_task() for managed_node2/dnf 25052 1726882475.04703: done queuing things up, now waiting for results queue to drain 25052 1726882475.04704: waiting for pending results... 
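The skip recorded just above is driven entirely by the when-guard quoted in its false_condition field: this run defines no connection or network_state interface of type "team", so the EL10 teaming abort never fires. A minimal sketch of what such a guard looks like as an Ansible task, assuming a plain fail action and an illustrative message (only the when expression is taken verbatim from the log; the real task body in roles/network/tasks/main.yml may differ):

# Illustrative reconstruction -- not the collection's actual source.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming configuration is not supported on EL10 or later  # assumed message
  when: >-
    network_connections | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", [])
    | selectattr("type", "defined")
    | selectattr("type", "match", "^team$") | list | length > 0

Because network_connections here contains only the single interface set in play vars and network_state falls back to the role default of {}, both selectattr chains produce empty lists, the whole expression folds to False, and the task is skipped with "Conditional result was False" as shown above.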
25052 1726882475.04866: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25052 1726882475.04948: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001c 25052 1726882475.04958: variable 'ansible_search_path' from source: unknown 25052 1726882475.04961: variable 'ansible_search_path' from source: unknown 25052 1726882475.04988: calling self._execute() 25052 1726882475.05055: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.05059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.05070: variable 'omit' from source: magic vars 25052 1726882475.05331: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.05340: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.05469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.07138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.07179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.07213: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.07245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.07263: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.07325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.07346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.07363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.07387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.07402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.07477: variable 'ansible_distribution' from source: facts 25052 1726882475.07480: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.07492: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25052 1726882475.07568: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.07658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.07674: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.07691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.07720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.07731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.07762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.07778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.07797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.07823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.07832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.07863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.07878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.07895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.07921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.07932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.08032: variable 'network_connections' from source: task vars 25052 1726882475.08040: variable 'interface' from source: play vars 25052 1726882475.08090: variable 'interface' from source: play vars 25052 1726882475.08140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.08248: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.08274: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.08300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.08323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.08352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.08367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.08387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.08412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.08453: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882475.08614: variable 'network_connections' from source: task vars 25052 1726882475.08619: variable 'interface' from source: play vars 25052 1726882475.08662: variable 'interface' from source: play vars 25052 1726882475.08686: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882475.08689: when evaluation is False, skipping this task 25052 1726882475.08691: _execute() done 25052 1726882475.08695: dumping result to json 25052 1726882475.08701: done dumping result, returning 25052 1726882475.08708: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-00000000001c] 25052 1726882475.08712: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001c 25052 1726882475.08797: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001c 25052 1726882475.08800: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882475.08883: no more pending results, returning what we have 25052 1726882475.08886: results queue empty 25052 1726882475.08887: checking for any_errors_fatal 25052 1726882475.08896: done checking for any_errors_fatal 25052 1726882475.08896: checking for max_fail_percentage 25052 1726882475.08898: done checking for max_fail_percentage 25052 1726882475.08899: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.08900: done checking to see if all hosts have failed 25052 1726882475.08900: getting the remaining hosts for this loop 25052 1726882475.08902: done getting the remaining hosts for this loop 25052 1726882475.08905: getting the next task for host managed_node2 25052 1726882475.08912: done getting next task for host managed_node2 25052 1726882475.08916: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates 
for network packages are available through the YUM package manager due to wireless or team interfaces 25052 1726882475.08918: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882475.08931: getting variables 25052 1726882475.08932: in VariableManager get_vars() 25052 1726882475.08966: Calling all_inventory to load vars for managed_node2 25052 1726882475.08968: Calling groups_inventory to load vars for managed_node2 25052 1726882475.08970: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.08978: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.08980: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.08982: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.09824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.10673: done with get_vars() 25052 1726882475.10688: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25052 1726882475.10743: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:35 -0400 (0:00:00.063) 0:00:12.062 ****** 25052 1726882475.10764: entering _queue_task() for managed_node2/yum 25052 1726882475.10765: Creating lock for yum 25052 1726882475.10988: worker is 1 (out of 1 available) 25052 1726882475.11004: exiting _queue_task() for managed_node2/yum 25052 1726882475.11015: done queuing things up, now waiting for results queue to drain 25052 1726882475.11016: waiting for pending results... 
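The DNF-availability check above is gated on two booleans supplied as role defaults, __network_wireless_connections_defined and __network_team_connections_defined, both of which came out False for this run. Their exact definitions are not visible in this log; a hedged sketch of how such flags could be derived from network_connections with the same selectattr pattern seen elsewhere in this output (the variable names are real, the expressions are assumptions):

# Illustrative only; the role defines these as role defaults, not set_fact.
- name: Derive wireless/team flags from network_connections (illustrative)
  ansible.builtin.set_fact:
    __network_wireless_connections_defined: >-
      {{ network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^wireless$") | list | length > 0 }}
    __network_team_connections_defined: >-
      {{ network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0 }}

The same pair of flags also gates the "Ask user's consent to restart NetworkManager" task later in this log, which is skipped for the same reason.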
25052 1726882475.11179: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25052 1726882475.11266: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001d 25052 1726882475.11277: variable 'ansible_search_path' from source: unknown 25052 1726882475.11281: variable 'ansible_search_path' from source: unknown 25052 1726882475.11317: calling self._execute() 25052 1726882475.11382: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.11386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.11398: variable 'omit' from source: magic vars 25052 1726882475.11651: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.11660: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.11773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.13223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.13272: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.13302: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.13332: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.13351: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.13409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.13432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.13450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.13475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.13485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.13553: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.13565: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25052 1726882475.13568: when evaluation is False, skipping this task 25052 1726882475.13570: _execute() done 25052 1726882475.13573: dumping result to json 25052 1726882475.13576: done dumping result, returning 25052 1726882475.13582: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-00000000001d] 25052 
1726882475.13586: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001d 25052 1726882475.13669: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001d 25052 1726882475.13671: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25052 1726882475.13723: no more pending results, returning what we have 25052 1726882475.13726: results queue empty 25052 1726882475.13727: checking for any_errors_fatal 25052 1726882475.13732: done checking for any_errors_fatal 25052 1726882475.13732: checking for max_fail_percentage 25052 1726882475.13734: done checking for max_fail_percentage 25052 1726882475.13735: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.13735: done checking to see if all hosts have failed 25052 1726882475.13736: getting the remaining hosts for this loop 25052 1726882475.13737: done getting the remaining hosts for this loop 25052 1726882475.13740: getting the next task for host managed_node2 25052 1726882475.13746: done getting next task for host managed_node2 25052 1726882475.13750: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25052 1726882475.13752: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.13764: getting variables 25052 1726882475.13766: in VariableManager get_vars() 25052 1726882475.13812: Calling all_inventory to load vars for managed_node2 25052 1726882475.13814: Calling groups_inventory to load vars for managed_node2 25052 1726882475.13816: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.13824: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.13826: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.13828: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.14573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.15517: done with get_vars() 25052 1726882475.15534: done getting variables 25052 1726882475.15572: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:35 -0400 (0:00:00.048) 0:00:12.110 ****** 25052 1726882475.15598: entering _queue_task() for managed_node2/fail 25052 1726882475.15802: worker is 1 (out of 1 available) 25052 1726882475.15815: exiting _queue_task() for managed_node2/fail 25052 1726882475.15825: done queuing things up, now waiting for results queue to drain 25052 1726882475.15827: waiting for pending results... 
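The YUM variant of the same check is version-gated rather than connection-gated: the earlier conditional showed ansible_distribution_major_version | int > 9 on this host, so the guard ansible_distribution_major_version | int < 8 is False and ansible-core's redirect of ansible.builtin.yum to ansible.builtin.dnf never has to do any work. A sketch of that gating pattern, assuming a check-mode yum call against network_packages (only the when expression comes from the log; the task body is a guess):

# Illustrative reconstruction of an EL7-only update check.
- name: Check if updates for network packages are available through the YUM package manager (illustrative)
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8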
25052 1726882475.15985: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25052 1726882475.16071: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001e 25052 1726882475.16087: variable 'ansible_search_path' from source: unknown 25052 1726882475.16097: variable 'ansible_search_path' from source: unknown 25052 1726882475.16120: calling self._execute() 25052 1726882475.16188: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.16196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.16200: variable 'omit' from source: magic vars 25052 1726882475.16447: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.16455: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.16536: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.16658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.18069: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.18120: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.18149: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.18173: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.18191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.18250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.18271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.18288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.18316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.18328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.18362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.18379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.18397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.18422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.18432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.18464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.18479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.18498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.18522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.18532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.18642: variable 'network_connections' from source: task vars 25052 1726882475.18651: variable 'interface' from source: play vars 25052 1726882475.18705: variable 'interface' from source: play vars 25052 1726882475.18751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.18857: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.18883: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.18918: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.18940: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.18969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.18984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.19008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.19026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.19067: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882475.19217: variable 'network_connections' from 
source: task vars 25052 1726882475.19223: variable 'interface' from source: play vars 25052 1726882475.19266: variable 'interface' from source: play vars 25052 1726882475.19294: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882475.19298: when evaluation is False, skipping this task 25052 1726882475.19301: _execute() done 25052 1726882475.19306: dumping result to json 25052 1726882475.19308: done dumping result, returning 25052 1726882475.19311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-00000000001e] 25052 1726882475.19316: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001e 25052 1726882475.19402: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001e 25052 1726882475.19405: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882475.19466: no more pending results, returning what we have 25052 1726882475.19470: results queue empty 25052 1726882475.19470: checking for any_errors_fatal 25052 1726882475.19476: done checking for any_errors_fatal 25052 1726882475.19477: checking for max_fail_percentage 25052 1726882475.19478: done checking for max_fail_percentage 25052 1726882475.19479: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.19480: done checking to see if all hosts have failed 25052 1726882475.19480: getting the remaining hosts for this loop 25052 1726882475.19482: done getting the remaining hosts for this loop 25052 1726882475.19485: getting the next task for host managed_node2 25052 1726882475.19490: done getting next task for host managed_node2 25052 1726882475.19497: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25052 1726882475.19500: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.19512: getting variables 25052 1726882475.19513: in VariableManager get_vars() 25052 1726882475.19545: Calling all_inventory to load vars for managed_node2 25052 1726882475.19547: Calling groups_inventory to load vars for managed_node2 25052 1726882475.19549: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.19556: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.19559: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.19561: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.20308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.21170: done with get_vars() 25052 1726882475.21188: done getting variables 25052 1726882475.21237: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:35 -0400 (0:00:00.056) 0:00:12.167 ****** 25052 1726882475.21261: entering _queue_task() for managed_node2/package 25052 1726882475.21503: worker is 1 (out of 1 available) 25052 1726882475.21515: exiting _queue_task() for managed_node2/package 25052 1726882475.21527: done queuing things up, now waiting for results queue to drain 25052 1726882475.21529: waiting for pending results... 25052 1726882475.21690: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 25052 1726882475.21783: in run() - task 12673a56-9f93-f7f6-4a6d-00000000001f 25052 1726882475.21795: variable 'ansible_search_path' from source: unknown 25052 1726882475.21802: variable 'ansible_search_path' from source: unknown 25052 1726882475.21830: calling self._execute() 25052 1726882475.21891: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.21901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.21909: variable 'omit' from source: magic vars 25052 1726882475.22165: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.22174: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.22312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.22502: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.22537: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.22562: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.22585: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.22664: variable 'network_packages' from source: role '' defaults 25052 1726882475.22736: variable '__network_provider_setup' from source: role '' defaults 25052 1726882475.22743: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882475.22795: variable 
'__network_service_name_default_nm' from source: role '' defaults 25052 1726882475.22805: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882475.22852: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882475.22962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.24269: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.24311: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.24345: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.24372: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.24392: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.24449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.24472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.24491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.24520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.24531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.24561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.24579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.24601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.24625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.24635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.24771: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25052 1726882475.24846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.24863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.24879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.24910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.24920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.24978: variable 'ansible_python' from source: facts 25052 1726882475.25000: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25052 1726882475.25055: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882475.25112: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882475.25191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.25211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.25232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.25255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.25266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.25301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.25320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.25340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.25363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.25374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.25467: variable 'network_connections' from source: task vars 25052 1726882475.25471: variable 'interface' from source: play vars 25052 1726882475.25543: variable 'interface' from source: play vars 25052 1726882475.25594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.25615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.25636: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.25658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.25692: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.25864: variable 'network_connections' from source: task vars 25052 1726882475.25867: variable 'interface' from source: play vars 25052 1726882475.25941: variable 'interface' from source: play vars 25052 1726882475.25978: variable '__network_packages_default_wireless' from source: role '' defaults 25052 1726882475.26037: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.26226: variable 'network_connections' from source: task vars 25052 1726882475.26229: variable 'interface' from source: play vars 25052 1726882475.26273: variable 'interface' from source: play vars 25052 1726882475.26292: variable '__network_packages_default_team' from source: role '' defaults 25052 1726882475.26349: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882475.26540: variable 'network_connections' from source: task vars 25052 1726882475.26543: variable 'interface' from source: play vars 25052 1726882475.26586: variable 'interface' from source: play vars 25052 1726882475.26631: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882475.26675: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882475.26680: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882475.26725: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882475.26863: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25052 1726882475.27387: variable 'network_connections' from source: task vars 25052 1726882475.27390: variable 'interface' from source: play vars 25052 1726882475.27439: variable 'interface' from source: play vars 25052 1726882475.27447: variable 'ansible_distribution' from source: facts 25052 1726882475.27450: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.27456: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.27472: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25052 1726882475.27580: variable 'ansible_distribution' from source: facts 25052 
1726882475.27583: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.27587: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.27602: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25052 1726882475.27711: variable 'ansible_distribution' from source: facts 25052 1726882475.27714: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.27717: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.27747: variable 'network_provider' from source: set_fact 25052 1726882475.27757: variable 'ansible_facts' from source: unknown 25052 1726882475.30915: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25052 1726882475.30919: when evaluation is False, skipping this task 25052 1726882475.30922: _execute() done 25052 1726882475.30925: dumping result to json 25052 1726882475.30927: done dumping result, returning 25052 1726882475.30930: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-f7f6-4a6d-00000000001f] 25052 1726882475.30932: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001f 25052 1726882475.31016: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000001f 25052 1726882475.31020: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25052 1726882475.31063: no more pending results, returning what we have 25052 1726882475.31066: results queue empty 25052 1726882475.31066: checking for any_errors_fatal 25052 1726882475.31073: done checking for any_errors_fatal 25052 1726882475.31073: checking for max_fail_percentage 25052 1726882475.31075: done checking for max_fail_percentage 25052 1726882475.31075: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.31076: done checking to see if all hosts have failed 25052 1726882475.31077: getting the remaining hosts for this loop 25052 1726882475.31078: done getting the remaining hosts for this loop 25052 1726882475.31081: getting the next task for host managed_node2 25052 1726882475.31087: done getting next task for host managed_node2 25052 1726882475.31090: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25052 1726882475.31095: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.31107: getting variables 25052 1726882475.31108: in VariableManager get_vars() 25052 1726882475.31144: Calling all_inventory to load vars for managed_node2 25052 1726882475.31146: Calling groups_inventory to load vars for managed_node2 25052 1726882475.31148: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.31157: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.31159: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.31162: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.32099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.35229: done with get_vars() 25052 1726882475.35245: done getting variables 25052 1726882475.35278: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:35 -0400 (0:00:00.140) 0:00:12.307 ****** 25052 1726882475.35299: entering _queue_task() for managed_node2/package 25052 1726882475.35545: worker is 1 (out of 1 available) 25052 1726882475.35558: exiting _queue_task() for managed_node2/package 25052 1726882475.35569: done queuing things up, now waiting for results queue to drain 25052 1726882475.35571: waiting for pending results... 
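For context, the "Install packages" result above was skipped because its condition "not network_packages is subset(ansible_facts.packages.keys())" evaluated to False, i.e. every package listed in network_packages was already present in the gathered package facts. The ansible.builtin.subset test (provided by the mathstuff test plugin loaded earlier in this log) behaves roughly like Python set containment; a minimal standalone sketch with assumed values, not taken from this run:

    # Hypothetical values; the real ones come from role defaults and package facts.
    network_packages = ["NetworkManager"]
    ansible_facts_packages = {"NetworkManager": [{"version": "1.48"}],
                              "openssh-server": [{"version": "9.8"}]}

    # Roughly what `network_packages is subset(ansible_facts.packages.keys())` checks:
    already_installed = set(network_packages) <= set(ansible_facts_packages.keys())
    print(already_installed)       # True
    print(not already_installed)   # False -> the task's `when` is False, so it is skipped
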
25052 1726882475.35746: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25052 1726882475.35842: in run() - task 12673a56-9f93-f7f6-4a6d-000000000020 25052 1726882475.35854: variable 'ansible_search_path' from source: unknown 25052 1726882475.35857: variable 'ansible_search_path' from source: unknown 25052 1726882475.35882: calling self._execute() 25052 1726882475.35951: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.35955: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.35964: variable 'omit' from source: magic vars 25052 1726882475.36238: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.36242: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.36323: variable 'network_state' from source: role '' defaults 25052 1726882475.36332: Evaluated conditional (network_state != {}): False 25052 1726882475.36335: when evaluation is False, skipping this task 25052 1726882475.36340: _execute() done 25052 1726882475.36342: dumping result to json 25052 1726882475.36344: done dumping result, returning 25052 1726882475.36410: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-f7f6-4a6d-000000000020] 25052 1726882475.36413: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000020 25052 1726882475.36477: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000020 25052 1726882475.36480: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882475.36528: no more pending results, returning what we have 25052 1726882475.36531: results queue empty 25052 1726882475.36532: checking for any_errors_fatal 25052 1726882475.36541: done checking for any_errors_fatal 25052 1726882475.36542: checking for max_fail_percentage 25052 1726882475.36543: done checking for max_fail_percentage 25052 1726882475.36544: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.36545: done checking to see if all hosts have failed 25052 1726882475.36545: getting the remaining hosts for this loop 25052 1726882475.36547: done getting the remaining hosts for this loop 25052 1726882475.36550: getting the next task for host managed_node2 25052 1726882475.36556: done getting next task for host managed_node2 25052 1726882475.36559: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25052 1726882475.36562: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.36575: getting variables 25052 1726882475.36576: in VariableManager get_vars() 25052 1726882475.36613: Calling all_inventory to load vars for managed_node2 25052 1726882475.36615: Calling groups_inventory to load vars for managed_node2 25052 1726882475.36617: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.36625: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.36627: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.36629: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.37350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.38439: done with get_vars() 25052 1726882475.38459: done getting variables 25052 1726882475.38520: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:35 -0400 (0:00:00.032) 0:00:12.340 ****** 25052 1726882475.38550: entering _queue_task() for managed_node2/package 25052 1726882475.38836: worker is 1 (out of 1 available) 25052 1726882475.38849: exiting _queue_task() for managed_node2/package 25052 1726882475.38862: done queuing things up, now waiting for results queue to drain 25052 1726882475.38863: waiting for pending results... 
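Both this task and the "Install NetworkManager and nmstate" task above are gated on the role default network_state, which is empty in this run; each is skipped with false_condition "network_state != {}" (the second skip follows just below). Ansible evaluates such conditionals through its templar, but the comparison itself is an ordinary Jinja2 expression; a rough standalone sketch using the jinja2 library directly, with assumed values:

    import jinja2

    env = jinja2.Environment()
    check = env.compile_expression("network_state != {}")

    print(check(network_state={}))                   # False -> task skipped, as seen in this log
    print(check(network_state={"interfaces": []}))   # True  -> the package task would run
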
25052 1726882475.39128: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25052 1726882475.39218: in run() - task 12673a56-9f93-f7f6-4a6d-000000000021 25052 1726882475.39229: variable 'ansible_search_path' from source: unknown 25052 1726882475.39233: variable 'ansible_search_path' from source: unknown 25052 1726882475.39261: calling self._execute() 25052 1726882475.39337: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.39341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.39353: variable 'omit' from source: magic vars 25052 1726882475.39622: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.39632: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.39716: variable 'network_state' from source: role '' defaults 25052 1726882475.39724: Evaluated conditional (network_state != {}): False 25052 1726882475.39727: when evaluation is False, skipping this task 25052 1726882475.39729: _execute() done 25052 1726882475.39732: dumping result to json 25052 1726882475.39735: done dumping result, returning 25052 1726882475.39742: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-f7f6-4a6d-000000000021] 25052 1726882475.39746: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000021 25052 1726882475.39834: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000021 25052 1726882475.39837: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882475.39879: no more pending results, returning what we have 25052 1726882475.39882: results queue empty 25052 1726882475.39883: checking for any_errors_fatal 25052 1726882475.39890: done checking for any_errors_fatal 25052 1726882475.39890: checking for max_fail_percentage 25052 1726882475.39892: done checking for max_fail_percentage 25052 1726882475.39894: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.39895: done checking to see if all hosts have failed 25052 1726882475.39896: getting the remaining hosts for this loop 25052 1726882475.39897: done getting the remaining hosts for this loop 25052 1726882475.39900: getting the next task for host managed_node2 25052 1726882475.39907: done getting next task for host managed_node2 25052 1726882475.39911: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25052 1726882475.39914: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.39928: getting variables 25052 1726882475.39929: in VariableManager get_vars() 25052 1726882475.39963: Calling all_inventory to load vars for managed_node2 25052 1726882475.39966: Calling groups_inventory to load vars for managed_node2 25052 1726882475.39968: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.39976: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.39978: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.39980: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.40945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.42951: done with get_vars() 25052 1726882475.42986: done getting variables 25052 1726882475.43102: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:35 -0400 (0:00:00.045) 0:00:12.386 ****** 25052 1726882475.43136: entering _queue_task() for managed_node2/service 25052 1726882475.43137: Creating lock for service 25052 1726882475.43504: worker is 1 (out of 1 available) 25052 1726882475.43605: exiting _queue_task() for managed_node2/service 25052 1726882475.43616: done queuing things up, now waiting for results queue to drain 25052 1726882475.43617: waiting for pending results... 
25052 1726882475.43916: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25052 1726882475.44000: in run() - task 12673a56-9f93-f7f6-4a6d-000000000022 25052 1726882475.44100: variable 'ansible_search_path' from source: unknown 25052 1726882475.44103: variable 'ansible_search_path' from source: unknown 25052 1726882475.44106: calling self._execute() 25052 1726882475.44156: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.44169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.44183: variable 'omit' from source: magic vars 25052 1726882475.44558: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.44573: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.44768: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.44900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.47223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.47311: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.47351: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.47398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.47428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.47517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.47550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.47578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.47698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.47704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.47706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.47731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.47758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25052 1726882475.47802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.47827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.47871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.47931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.47934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.47975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.47997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.48180: variable 'network_connections' from source: task vars 25052 1726882475.48256: variable 'interface' from source: play vars 25052 1726882475.48283: variable 'interface' from source: play vars 25052 1726882475.48363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.48546: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.48606: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.48639: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.48703: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.48724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.48753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.48782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.48823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.48920: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882475.49147: variable 'network_connections' from source: task vars 25052 1726882475.49156: variable 'interface' from source: 
play vars 25052 1726882475.49226: variable 'interface' from source: play vars 25052 1726882475.49268: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882475.49277: when evaluation is False, skipping this task 25052 1726882475.49283: _execute() done 25052 1726882475.49498: dumping result to json 25052 1726882475.49501: done dumping result, returning 25052 1726882475.49504: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-000000000022] 25052 1726882475.49506: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000022 25052 1726882475.49576: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000022 25052 1726882475.49585: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882475.49637: no more pending results, returning what we have 25052 1726882475.49641: results queue empty 25052 1726882475.49642: checking for any_errors_fatal 25052 1726882475.49647: done checking for any_errors_fatal 25052 1726882475.49648: checking for max_fail_percentage 25052 1726882475.49650: done checking for max_fail_percentage 25052 1726882475.49650: checking to see if all hosts have failed and the running result is not ok 25052 1726882475.49651: done checking to see if all hosts have failed 25052 1726882475.49652: getting the remaining hosts for this loop 25052 1726882475.49653: done getting the remaining hosts for this loop 25052 1726882475.49657: getting the next task for host managed_node2 25052 1726882475.49663: done getting next task for host managed_node2 25052 1726882475.49668: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25052 1726882475.49670: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882475.49684: getting variables 25052 1726882475.49686: in VariableManager get_vars() 25052 1726882475.49732: Calling all_inventory to load vars for managed_node2 25052 1726882475.49735: Calling groups_inventory to load vars for managed_node2 25052 1726882475.49737: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882475.49747: Calling all_plugins_play to load vars for managed_node2 25052 1726882475.49750: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882475.49752: Calling groups_plugins_play to load vars for managed_node2 25052 1726882475.51298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882475.52970: done with get_vars() 25052 1726882475.52995: done getting variables 25052 1726882475.53054: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:35 -0400 (0:00:00.099) 0:00:12.485 ****** 25052 1726882475.53084: entering _queue_task() for managed_node2/service 25052 1726882475.53499: worker is 1 (out of 1 available) 25052 1726882475.53510: exiting _queue_task() for managed_node2/service 25052 1726882475.53520: done queuing things up, now waiting for results queue to drain 25052 1726882475.53521: waiting for pending results... 25052 1726882475.53721: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25052 1726882475.53854: in run() - task 12673a56-9f93-f7f6-4a6d-000000000023 25052 1726882475.53858: variable 'ansible_search_path' from source: unknown 25052 1726882475.53860: variable 'ansible_search_path' from source: unknown 25052 1726882475.53895: calling self._execute() 25052 1726882475.54035: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.54039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.54042: variable 'omit' from source: magic vars 25052 1726882475.54396: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.54415: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882475.54588: variable 'network_provider' from source: set_fact 25052 1726882475.54603: variable 'network_state' from source: role '' defaults 25052 1726882475.54689: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25052 1726882475.54697: variable 'omit' from source: magic vars 25052 1726882475.54699: variable 'omit' from source: magic vars 25052 1726882475.54728: variable 'network_service_name' from source: role '' defaults 25052 1726882475.54806: variable 'network_service_name' from source: role '' defaults 25052 1726882475.54925: variable '__network_provider_setup' from source: role '' defaults 25052 1726882475.54942: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882475.55020: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882475.55034: variable '__network_packages_default_nm' from source: role '' defaults 
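Unlike the skipped tasks above, "Enable and start NetworkManager" passes its condition (network_provider == "nm" or network_state != {}), so further below the worker opens an SSH session to the managed host, uploads AnsiballZ_systemd.py to a temporary directory, and executes it; the large JSON dump further below is the set of NetworkManager.service unit properties the systemd module reports back. Roughly the same properties can be inspected by hand with systemctl show; a minimal Python sketch, assuming a systemd-based host with NetworkManager installed:

    import subprocess

    # Query a few of the unit properties that also appear in the module's JSON result below.
    result = subprocess.run(
        ["systemctl", "show", "NetworkManager.service",
         "--property=MainPID,ExecMainStartTimestamp,Restart,ControlGroup"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)   # Key=Value pairs, one per line, e.g. MainPID=...
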
25052 1726882475.55107: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882475.55382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882475.57861: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882475.57946: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882475.58005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882475.58072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882475.58078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882475.58156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.58200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.58290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.58297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.58299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.58337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.58363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.58399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.58443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.58462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.58681: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25052 1726882475.58816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.58999: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.59002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.59005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.59007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.59036: variable 'ansible_python' from source: facts 25052 1726882475.59064: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25052 1726882475.59163: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882475.59256: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882475.59399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.59429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.59465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.59513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.59533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.59596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882475.59670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882475.59673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.59719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882475.59739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882475.60002: variable 'network_connections' from 
source: task vars 25052 1726882475.60006: variable 'interface' from source: play vars 25052 1726882475.60008: variable 'interface' from source: play vars 25052 1726882475.60097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882475.60303: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882475.60362: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882475.60417: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882475.60468: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882475.60531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882475.60570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882475.60610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882475.60652: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882475.60708: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.61007: variable 'network_connections' from source: task vars 25052 1726882475.61020: variable 'interface' from source: play vars 25052 1726882475.61109: variable 'interface' from source: play vars 25052 1726882475.61161: variable '__network_packages_default_wireless' from source: role '' defaults 25052 1726882475.61250: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882475.61548: variable 'network_connections' from source: task vars 25052 1726882475.61599: variable 'interface' from source: play vars 25052 1726882475.61643: variable 'interface' from source: play vars 25052 1726882475.61673: variable '__network_packages_default_team' from source: role '' defaults 25052 1726882475.61766: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882475.62061: variable 'network_connections' from source: task vars 25052 1726882475.62078: variable 'interface' from source: play vars 25052 1726882475.62151: variable 'interface' from source: play vars 25052 1726882475.62301: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882475.62305: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882475.62307: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882475.62367: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882475.62588: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25052 1726882475.63079: variable 'network_connections' from source: task vars 25052 1726882475.63089: variable 'interface' from source: play vars 25052 1726882475.63153: variable 'interface' from source: play vars 25052 
1726882475.63175: variable 'ansible_distribution' from source: facts 25052 1726882475.63183: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.63197: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.63224: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25052 1726882475.63414: variable 'ansible_distribution' from source: facts 25052 1726882475.63423: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.63598: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.63602: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25052 1726882475.63609: variable 'ansible_distribution' from source: facts 25052 1726882475.63619: variable '__network_rh_distros' from source: role '' defaults 25052 1726882475.63629: variable 'ansible_distribution_major_version' from source: facts 25052 1726882475.63669: variable 'network_provider' from source: set_fact 25052 1726882475.63701: variable 'omit' from source: magic vars 25052 1726882475.63739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882475.63772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882475.63800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882475.63828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882475.63844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882475.63878: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882475.63887: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.63899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.64010: Set connection var ansible_pipelining to False 25052 1726882475.64020: Set connection var ansible_connection to ssh 25052 1726882475.64028: Set connection var ansible_shell_type to sh 25052 1726882475.64049: Set connection var ansible_timeout to 10 25052 1726882475.64153: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882475.64156: Set connection var ansible_shell_executable to /bin/sh 25052 1726882475.64159: variable 'ansible_shell_executable' from source: unknown 25052 1726882475.64161: variable 'ansible_connection' from source: unknown 25052 1726882475.64163: variable 'ansible_module_compression' from source: unknown 25052 1726882475.64165: variable 'ansible_shell_type' from source: unknown 25052 1726882475.64166: variable 'ansible_shell_executable' from source: unknown 25052 1726882475.64168: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882475.64170: variable 'ansible_pipelining' from source: unknown 25052 1726882475.64172: variable 'ansible_timeout' from source: unknown 25052 1726882475.64174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882475.64243: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 25052 1726882475.64265: variable 'omit' from source: magic vars 25052 1726882475.64280: starting attempt loop 25052 1726882475.64288: running the handler 25052 1726882475.64372: variable 'ansible_facts' from source: unknown 25052 1726882475.65148: _low_level_execute_command(): starting 25052 1726882475.65158: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882475.65764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882475.65775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882475.65883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882475.65887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882475.65985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882475.67685: stdout chunk (state=3): >>>/root <<< 25052 1726882475.67816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882475.67834: stderr chunk (state=3): >>><<< 25052 1726882475.67845: stdout chunk (state=3): >>><<< 25052 1726882475.67902: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882475.67905: _low_level_execute_command(): starting 25052 1726882475.67910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070 `" && echo ansible-tmp-1726882475.678768-25649-257161988082070="` echo /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070 `" ) && sleep 0' 25052 1726882475.68499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882475.68503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.68529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882475.68532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.68582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882475.68586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882475.68590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882475.68654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882475.70527: stdout chunk (state=3): >>>ansible-tmp-1726882475.678768-25649-257161988082070=/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070 <<< 25052 1726882475.70636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882475.70659: stderr chunk (state=3): >>><<< 25052 1726882475.70662: stdout chunk (state=3): >>><<< 25052 1726882475.70678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882475.678768-25649-257161988082070=/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 
1726882475.70706: variable 'ansible_module_compression' from source: unknown 25052 1726882475.70746: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 25052 1726882475.70750: ANSIBALLZ: Acquiring lock 25052 1726882475.70753: ANSIBALLZ: Lock acquired: 140207139645744 25052 1726882475.70755: ANSIBALLZ: Creating module 25052 1726882475.88834: ANSIBALLZ: Writing module into payload 25052 1726882475.88940: ANSIBALLZ: Writing module 25052 1726882475.88963: ANSIBALLZ: Renaming module 25052 1726882475.88969: ANSIBALLZ: Done creating module 25052 1726882475.88988: variable 'ansible_facts' from source: unknown 25052 1726882475.89105: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py 25052 1726882475.89207: Sending initial data 25052 1726882475.89211: Sent initial data (155 bytes) 25052 1726882475.89666: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882475.89669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882475.89676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882475.89678: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882475.89681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.89731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882475.89734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882475.89805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882475.91352: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882475.91414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882475.91474: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmporxdgiva /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py <<< 25052 1726882475.91481: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py" <<< 25052 1726882475.91534: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmporxdgiva" to remote "/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py" <<< 25052 1726882475.92701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882475.92753: stderr chunk (state=3): >>><<< 25052 1726882475.92756: stdout chunk (state=3): >>><<< 25052 1726882475.92784: done transferring module to remote 25052 1726882475.92797: _low_level_execute_command(): starting 25052 1726882475.92800: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/ /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py && sleep 0' 25052 1726882475.93242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882475.93246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882475.93248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.93250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882475.93252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.93350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882475.93357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882475.93434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882475.95166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882475.95186: stderr chunk (state=3): >>><<< 25052 1726882475.95189: stdout chunk (state=3): >>><<< 25052 1726882475.95208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882475.95211: _low_level_execute_command(): starting 25052 1726882475.95216: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/AnsiballZ_systemd.py && sleep 0' 25052 1726882475.95698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882475.95702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.95704: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882475.95706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882475.95777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882475.95839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882476.24750: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312222208", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1139919000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 25052 1726882476.24759: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": 
"infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25052 1726882476.26282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882476.26313: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882476.26358: stderr chunk (state=3): >>><<< 25052 1726882476.26413: stdout chunk (state=3): >>><<< 25052 1726882476.26801: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3312222208", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1139919000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882476.26910: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882476.26932: _low_level_execute_command(): starting 25052 1726882476.26941: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882475.678768-25649-257161988082070/ > /dev/null 2>&1 && sleep 0' 25052 1726882476.27615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882476.27705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882476.27711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882476.27746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882476.27757: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 25052 1726882476.27765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882476.27937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882476.29745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882476.29899: stderr chunk (state=3): >>><<< 25052 1726882476.29903: stdout chunk (state=3): >>><<< 25052 1726882476.29905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882476.29908: handler run complete 25052 1726882476.29930: attempt loop complete, returning result 25052 1726882476.29933: _execute() done 25052 1726882476.29936: dumping result to json 25052 1726882476.29958: done dumping result, returning 25052 1726882476.29968: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-f7f6-4a6d-000000000023] 25052 1726882476.30173: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000023 25052 1726882476.30589: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000023 25052 1726882476.30595: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882476.30640: no more pending results, returning what we have 25052 1726882476.30644: results queue empty 25052 1726882476.30644: checking for any_errors_fatal 25052 1726882476.30651: done checking for any_errors_fatal 25052 1726882476.30652: checking for max_fail_percentage 25052 1726882476.30654: done checking for max_fail_percentage 25052 1726882476.30655: checking to see if all hosts have failed and the running result is not ok 25052 1726882476.30655: done checking to see if all hosts have failed 25052 1726882476.30656: getting the remaining hosts for this loop 25052 1726882476.30657: done getting the remaining hosts for this loop 25052 1726882476.30661: getting the next task for host managed_node2 25052 1726882476.30668: done getting next task for host managed_node2 25052 1726882476.30671: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25052 1726882476.30674: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882476.30685: getting variables 25052 1726882476.30687: in VariableManager get_vars() 25052 1726882476.30731: Calling all_inventory to load vars for managed_node2 25052 1726882476.30734: Calling groups_inventory to load vars for managed_node2 25052 1726882476.30737: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882476.30747: Calling all_plugins_play to load vars for managed_node2 25052 1726882476.30750: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882476.30753: Calling groups_plugins_play to load vars for managed_node2 25052 1726882476.33456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882476.36645: done with get_vars() 25052 1726882476.36674: done getting variables 25052 1726882476.36940: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:36 -0400 (0:00:00.838) 0:00:13.324 ****** 25052 1726882476.36976: entering _queue_task() for managed_node2/service 25052 1726882476.37729: worker is 1 (out of 1 available) 25052 1726882476.37739: exiting _queue_task() for managed_node2/service 25052 1726882476.37749: done queuing things up, now waiting for results queue to drain 25052 1726882476.37750: waiting for pending results... 
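The censored "ok" result above comes from the systemd module invocation whose module_args are logged verbatim (name=NetworkManager, state=started, enabled=True, with no_log in effect), dispatched through the 'service' action plugin and executed remotely as AnsiballZ_systemd.py. As a hedged sketch only, a task producing that invocation would look roughly like the following; the task body is reconstructed from the logged module_args and is not the role's literal source:

    - name: Enable and start NetworkManager
      ansible.builtin.service:   # the 'service' action plugin selected the systemd backend here (AnsiballZ_systemd.py above)
        name: NetworkManager     # from the logged module_args
        state: started
        enabled: true
      no_log: true               # consistent with the censored result reported above

The JSON blob returned by the module (ActiveState, UnitFileState, ExecStart, resource limits, and so on) is the full set of systemd unit properties for NetworkManager.service; because no_log is set, the play recap only shows the censored placeholder.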
25052 1726882476.38313: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25052 1726882476.38353: in run() - task 12673a56-9f93-f7f6-4a6d-000000000024 25052 1726882476.38446: variable 'ansible_search_path' from source: unknown 25052 1726882476.38450: variable 'ansible_search_path' from source: unknown 25052 1726882476.38663: calling self._execute() 25052 1726882476.38704: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.38717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.38736: variable 'omit' from source: magic vars 25052 1726882476.39537: variable 'ansible_distribution_major_version' from source: facts 25052 1726882476.39557: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882476.39899: variable 'network_provider' from source: set_fact 25052 1726882476.39902: Evaluated conditional (network_provider == "nm"): True 25052 1726882476.40190: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882476.40203: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882476.40582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882476.44835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882476.45059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882476.45299: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882476.45302: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882476.45308: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882476.45474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882476.45515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882476.45630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882476.45680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882476.45768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882476.45824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882476.46075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25052 1726882476.46079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882476.46082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882476.46298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882476.46301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882476.46304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882476.46306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882476.46429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882476.46448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882476.46800: variable 'network_connections' from source: task vars 25052 1726882476.46820: variable 'interface' from source: play vars 25052 1726882476.46955: variable 'interface' from source: play vars 25052 1726882476.47040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882476.47445: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882476.47527: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882476.47633: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882476.47829: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882476.48000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882476.48003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882476.48005: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882476.48008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 
1726882476.48010: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882476.48547: variable 'network_connections' from source: task vars 25052 1726882476.48565: variable 'interface' from source: play vars 25052 1726882476.48730: variable 'interface' from source: play vars 25052 1726882476.48998: Evaluated conditional (__network_wpa_supplicant_required): False 25052 1726882476.49002: when evaluation is False, skipping this task 25052 1726882476.49005: _execute() done 25052 1726882476.49008: dumping result to json 25052 1726882476.49011: done dumping result, returning 25052 1726882476.49014: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-f7f6-4a6d-000000000024] 25052 1726882476.49025: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000024 25052 1726882476.49096: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000024 25052 1726882476.49100: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25052 1726882476.49177: no more pending results, returning what we have 25052 1726882476.49180: results queue empty 25052 1726882476.49181: checking for any_errors_fatal 25052 1726882476.49207: done checking for any_errors_fatal 25052 1726882476.49208: checking for max_fail_percentage 25052 1726882476.49210: done checking for max_fail_percentage 25052 1726882476.49211: checking to see if all hosts have failed and the running result is not ok 25052 1726882476.49212: done checking to see if all hosts have failed 25052 1726882476.49212: getting the remaining hosts for this loop 25052 1726882476.49214: done getting the remaining hosts for this loop 25052 1726882476.49217: getting the next task for host managed_node2 25052 1726882476.49225: done getting next task for host managed_node2 25052 1726882476.49228: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25052 1726882476.49231: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882476.49243: getting variables 25052 1726882476.49244: in VariableManager get_vars() 25052 1726882476.49283: Calling all_inventory to load vars for managed_node2 25052 1726882476.49285: Calling groups_inventory to load vars for managed_node2 25052 1726882476.49287: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882476.49400: Calling all_plugins_play to load vars for managed_node2 25052 1726882476.49404: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882476.49408: Calling groups_plugins_play to load vars for managed_node2 25052 1726882476.52318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882476.55502: done with get_vars() 25052 1726882476.55529: done getting variables 25052 1726882476.55589: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:36 -0400 (0:00:00.186) 0:00:13.511 ****** 25052 1726882476.55623: entering _queue_task() for managed_node2/service 25052 1726882476.56364: worker is 1 (out of 1 available) 25052 1726882476.56375: exiting _queue_task() for managed_node2/service 25052 1726882476.56386: done queuing things up, now waiting for results queue to drain 25052 1726882476.56387: waiting for pending results... 25052 1726882476.56922: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 25052 1726882476.57127: in run() - task 12673a56-9f93-f7f6-4a6d-000000000025 25052 1726882476.57148: variable 'ansible_search_path' from source: unknown 25052 1726882476.57235: variable 'ansible_search_path' from source: unknown 25052 1726882476.57453: calling self._execute() 25052 1726882476.57456: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.57459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.57669: variable 'omit' from source: magic vars 25052 1726882476.58349: variable 'ansible_distribution_major_version' from source: facts 25052 1726882476.58457: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882476.58489: variable 'network_provider' from source: set_fact 25052 1726882476.58573: Evaluated conditional (network_provider == "initscripts"): False 25052 1726882476.58582: when evaluation is False, skipping this task 25052 1726882476.58589: _execute() done 25052 1726882476.58602: dumping result to json 25052 1726882476.58610: done dumping result, returning 25052 1726882476.58621: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-f7f6-4a6d-000000000025] 25052 1726882476.58631: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000025 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882476.58774: no more pending results, returning what we have 25052 1726882476.58779: results queue empty 25052 1726882476.58779: checking for 
any_errors_fatal 25052 1726882476.58797: done checking for any_errors_fatal 25052 1726882476.58798: checking for max_fail_percentage 25052 1726882476.58800: done checking for max_fail_percentage 25052 1726882476.58801: checking to see if all hosts have failed and the running result is not ok 25052 1726882476.58802: done checking to see if all hosts have failed 25052 1726882476.58803: getting the remaining hosts for this loop 25052 1726882476.58804: done getting the remaining hosts for this loop 25052 1726882476.58809: getting the next task for host managed_node2 25052 1726882476.58817: done getting next task for host managed_node2 25052 1726882476.58823: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25052 1726882476.58826: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882476.58841: getting variables 25052 1726882476.58843: in VariableManager get_vars() 25052 1726882476.58882: Calling all_inventory to load vars for managed_node2 25052 1726882476.58885: Calling groups_inventory to load vars for managed_node2 25052 1726882476.58887: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882476.59099: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000025 25052 1726882476.59102: WORKER PROCESS EXITING 25052 1726882476.59113: Calling all_plugins_play to load vars for managed_node2 25052 1726882476.59116: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882476.59118: Calling groups_plugins_play to load vars for managed_node2 25052 1726882476.61566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882476.64564: done with get_vars() 25052 1726882476.64588: done getting variables 25052 1726882476.64648: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:36 -0400 (0:00:00.090) 0:00:13.601 ****** 25052 1726882476.64681: entering _queue_task() for managed_node2/copy 25052 1726882476.65415: worker is 1 (out of 1 available) 25052 1726882476.65429: exiting _queue_task() for managed_node2/copy 25052 1726882476.65440: done queuing things up, now waiting for results queue to drain 25052 1726882476.65441: waiting for pending results... 
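The conditional skips recorded above ("Enable and start wpa_supplicant", "Enable network service") follow the same pattern: each task first evaluates ansible_distribution_major_version != '6' and then a provider-specific check, and is skipped when that check is false (the false_condition is echoed in the skip result). A minimal, hypothetical illustration of that when-chain, assuming a placeholder task body rather than the role's literal source:

    - name: Enable network service
      ansible.builtin.service:
        name: network            # hypothetical; the skipped task never reaches the point of naming a service in this log
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"

Since network_provider was set to "nm" on this host (evaluated True earlier in the run), the initscripts-only tasks evaluate to False and are skipped, which is exactly what the "skipping: [managed_node2]" results record.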
25052 1726882476.65971: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25052 1726882476.66211: in run() - task 12673a56-9f93-f7f6-4a6d-000000000026 25052 1726882476.66230: variable 'ansible_search_path' from source: unknown 25052 1726882476.66242: variable 'ansible_search_path' from source: unknown 25052 1726882476.66280: calling self._execute() 25052 1726882476.66566: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.66570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.66574: variable 'omit' from source: magic vars 25052 1726882476.67290: variable 'ansible_distribution_major_version' from source: facts 25052 1726882476.67413: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882476.67538: variable 'network_provider' from source: set_fact 25052 1726882476.67699: Evaluated conditional (network_provider == "initscripts"): False 25052 1726882476.67702: when evaluation is False, skipping this task 25052 1726882476.67705: _execute() done 25052 1726882476.67707: dumping result to json 25052 1726882476.67710: done dumping result, returning 25052 1726882476.67713: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-f7f6-4a6d-000000000026] 25052 1726882476.67716: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000026 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25052 1726882476.67859: no more pending results, returning what we have 25052 1726882476.67863: results queue empty 25052 1726882476.67864: checking for any_errors_fatal 25052 1726882476.67872: done checking for any_errors_fatal 25052 1726882476.67873: checking for max_fail_percentage 25052 1726882476.67875: done checking for max_fail_percentage 25052 1726882476.67876: checking to see if all hosts have failed and the running result is not ok 25052 1726882476.67877: done checking to see if all hosts have failed 25052 1726882476.67878: getting the remaining hosts for this loop 25052 1726882476.67879: done getting the remaining hosts for this loop 25052 1726882476.67883: getting the next task for host managed_node2 25052 1726882476.67891: done getting next task for host managed_node2 25052 1726882476.67897: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25052 1726882476.67900: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882476.67914: getting variables 25052 1726882476.67915: in VariableManager get_vars() 25052 1726882476.67962: Calling all_inventory to load vars for managed_node2 25052 1726882476.67965: Calling groups_inventory to load vars for managed_node2 25052 1726882476.67967: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882476.67978: Calling all_plugins_play to load vars for managed_node2 25052 1726882476.67980: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882476.67983: Calling groups_plugins_play to load vars for managed_node2 25052 1726882476.69208: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000026 25052 1726882476.70082: WORKER PROCESS EXITING 25052 1726882476.70769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882476.73779: done with get_vars() 25052 1726882476.73910: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:36 -0400 (0:00:00.093) 0:00:13.694 ****** 25052 1726882476.74099: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25052 1726882476.74102: Creating lock for fedora.linux_system_roles.network_connections 25052 1726882476.74643: worker is 1 (out of 1 available) 25052 1726882476.74655: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25052 1726882476.74667: done queuing things up, now waiting for results queue to drain 25052 1726882476.74668: waiting for pending results... 25052 1726882476.75395: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25052 1726882476.75629: in run() - task 12673a56-9f93-f7f6-4a6d-000000000027 25052 1726882476.75649: variable 'ansible_search_path' from source: unknown 25052 1726882476.75657: variable 'ansible_search_path' from source: unknown 25052 1726882476.75700: calling self._execute() 25052 1726882476.75896: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.76038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.76042: variable 'omit' from source: magic vars 25052 1726882476.76909: variable 'ansible_distribution_major_version' from source: facts 25052 1726882476.76913: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882476.76915: variable 'omit' from source: magic vars 25052 1726882476.76917: variable 'omit' from source: magic vars 25052 1726882476.77399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882476.81655: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882476.81846: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882476.82102: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882476.82105: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882476.82108: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882476.82251: 
variable 'network_provider' from source: set_fact 25052 1726882476.82518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882476.82641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882476.82673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882476.82971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882476.82975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882476.83019: variable 'omit' from source: magic vars 25052 1726882476.83274: variable 'omit' from source: magic vars 25052 1726882476.83506: variable 'network_connections' from source: task vars 25052 1726882476.83736: variable 'interface' from source: play vars 25052 1726882476.83739: variable 'interface' from source: play vars 25052 1726882476.83944: variable 'omit' from source: magic vars 25052 1726882476.84074: variable '__lsr_ansible_managed' from source: task vars 25052 1726882476.84198: variable '__lsr_ansible_managed' from source: task vars 25052 1726882476.84790: Loaded config def from plugin (lookup/template) 25052 1726882476.84805: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25052 1726882476.84998: File lookup term: get_ansible_managed.j2 25052 1726882476.85001: variable 'ansible_search_path' from source: unknown 25052 1726882476.85004: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25052 1726882476.85009: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25052 1726882476.85011: variable 'ansible_search_path' from source: unknown 25052 1726882476.97744: variable 'ansible_managed' from source: unknown 25052 1726882476.97918: variable 'omit' from source: 
magic vars 25052 1726882476.97952: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882476.97982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882476.98016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882476.98038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882476.98053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882476.98084: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882476.98096: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.98105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.98224: Set connection var ansible_pipelining to False 25052 1726882476.98228: Set connection var ansible_connection to ssh 25052 1726882476.98230: Set connection var ansible_shell_type to sh 25052 1726882476.98232: Set connection var ansible_timeout to 10 25052 1726882476.98244: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882476.98298: Set connection var ansible_shell_executable to /bin/sh 25052 1726882476.98301: variable 'ansible_shell_executable' from source: unknown 25052 1726882476.98304: variable 'ansible_connection' from source: unknown 25052 1726882476.98306: variable 'ansible_module_compression' from source: unknown 25052 1726882476.98308: variable 'ansible_shell_type' from source: unknown 25052 1726882476.98310: variable 'ansible_shell_executable' from source: unknown 25052 1726882476.98312: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882476.98314: variable 'ansible_pipelining' from source: unknown 25052 1726882476.98322: variable 'ansible_timeout' from source: unknown 25052 1726882476.98336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882476.98473: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882476.98550: variable 'omit' from source: magic vars 25052 1726882476.98554: starting attempt loop 25052 1726882476.98556: running the handler 25052 1726882476.98558: _low_level_execute_command(): starting 25052 1726882476.98561: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882476.99278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882476.99399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882476.99404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882476.99407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882476.99434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882476.99657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882476.99796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882477.01632: stdout chunk (state=3): >>>/root <<< 25052 1726882477.01708: stdout chunk (state=3): >>><<< 25052 1726882477.01900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882477.01903: stderr chunk (state=3): >>><<< 25052 1726882477.01906: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882477.01909: _low_level_execute_command(): starting 25052 1726882477.01912: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444 `" && echo ansible-tmp-1726882477.0185065-25719-217503584467444="` echo /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444 `" ) && sleep 0' 25052 1726882477.03777: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882477.04136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882477.04200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882477.06279: stdout chunk (state=3): >>>ansible-tmp-1726882477.0185065-25719-217503584467444=/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444 <<< 25052 1726882477.06422: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882477.06426: stdout chunk (state=3): >>><<< 25052 1726882477.06434: stderr chunk (state=3): >>><<< 25052 1726882477.06454: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882477.0185065-25719-217503584467444=/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882477.06503: variable 'ansible_module_compression' from source: unknown 25052 1726882477.06548: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 25052 1726882477.06552: ANSIBALLZ: Acquiring lock 25052 1726882477.06554: ANSIBALLZ: Lock acquired: 140207134536192 25052 1726882477.06557: ANSIBALLZ: Creating module 25052 1726882477.51631: ANSIBALLZ: Writing module into payload 25052 1726882477.52254: ANSIBALLZ: Writing module 25052 1726882477.52501: ANSIBALLZ: Renaming module 25052 1726882477.52505: ANSIBALLZ: Done creating module 25052 1726882477.52507: variable 'ansible_facts' from source: unknown 25052 1726882477.52583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py 25052 1726882477.53071: Sending initial data 25052 1726882477.53074: Sent initial data (168 bytes) 25052 1726882477.54483: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882477.54590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882477.54705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882477.54718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882477.54925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882477.56714: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25052 1726882477.56738: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882477.56880: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882477.56906: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmplcow8dle /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py <<< 25052 1726882477.56917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py" <<< 25052 1726882477.56995: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmplcow8dle" to remote "/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py" <<< 25052 1726882477.60276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882477.60340: stderr chunk (state=3): >>><<< 25052 1726882477.60527: stdout chunk (state=3): >>><<< 25052 1726882477.60531: done transferring module to remote 25052 1726882477.60533: _low_level_execute_command(): starting 25052 1726882477.60536: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/ /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py && sleep 0' 25052 1726882477.61842: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882477.61951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882477.62064: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882477.62077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882477.62171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882477.64015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882477.64200: stderr chunk (state=3): >>><<< 25052 1726882477.64204: stdout chunk (state=3): >>><<< 25052 1726882477.64206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882477.64209: _low_level_execute_command(): starting 25052 1726882477.64212: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/AnsiballZ_network_connections.py && sleep 0' 25052 1726882477.65463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882477.65480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882477.65507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882477.65723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882477.65912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882477.66418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882479.72548: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible 
managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25052 1726882479.74512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882479.74516: stdout chunk (state=3): >>><<< 25052 1726882479.74519: stderr chunk (state=3): >>><<< 25052 1726882479.74545: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "type": "ethernet", "state": "up", "ip": {"dhcp4": false, "auto6": false, "address": ["2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32"], "gateway6": "2001:db8::1"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
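The module_args logged for fedora.linux_system_roles.network_connections above fully describe the requested profile: provider nm, a single ethernet connection veth0 brought up with three static IPv6 addresses and an IPv6 gateway, with DHCPv4 and IPv6 autoconf disabled. A minimal play that would drive the role to an equivalent invocation looks roughly like the sketch below; every connection value is copied from the logged invocation, while the play header (hosts, role reference) is illustrative and not confirmed by this log.

- hosts: managed_node2
  vars:
    network_provider: nm          # matches the logged set_fact and the "provider": "nm" module arg
    network_connections:
      - name: veth0
        type: ethernet
        state: up
        ip:
          dhcp4: false
          auto6: false
          address:
            - 2001:db8::2/32
            - 2001:db8::3/32
            - 2001:db8::4/32
          gateway6: "2001:db8::1"
  roles:
    - fedora.linux_system_roles.network

The __header value ("# Ansible managed ... system_role:network") seen in the module_args is generated by the role itself from the get_ansible_managed.j2 template looked up earlier in this log, so it does not appear among the user-facing variables.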
25052 1726882479.74609: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'type': 'ethernet', 'state': 'up', 'ip': {'dhcp4': False, 'auto6': False, 'address': ['2001:db8::2/32', '2001:db8::3/32', '2001:db8::4/32'], 'gateway6': '2001:db8::1'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882479.74623: _low_level_execute_command(): starting 25052 1726882479.74640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882477.0185065-25719-217503584467444/ > /dev/null 2>&1 && sleep 0' 25052 1726882479.75233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882479.75271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882479.75374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882479.75432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882479.77411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882479.77415: stdout chunk (state=3): >>><<< 25052 1726882479.77418: stderr chunk (state=3): >>><<< 25052 1726882479.77425: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882479.77428: handler run complete 25052 1726882479.77430: attempt loop complete, returning result 25052 1726882479.77432: _execute() done 25052 1726882479.77434: dumping result to json 25052 1726882479.77438: done dumping result, returning 25052 1726882479.77476: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-f7f6-4a6d-000000000027] 25052 1726882479.77479: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000027 25052 1726882479.77637: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000027 25052 1726882479.77640: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc [004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active) 25052 1726882479.77760: no more pending results, returning what we have 25052 1726882479.77763: results queue empty 25052 1726882479.77765: checking for any_errors_fatal 25052 1726882479.77771: done checking for any_errors_fatal 25052 1726882479.77771: checking for max_fail_percentage 25052 1726882479.77773: done checking for max_fail_percentage 25052 1726882479.77773: checking to see if all hosts have failed and the running result is not ok 25052 1726882479.77774: done checking to see if all hosts have failed 25052 1726882479.77775: getting the remaining hosts for this loop 25052 1726882479.77776: done getting the remaining hosts for this loop 25052 1726882479.77779: getting the next task for host managed_node2 25052 1726882479.77786: done getting next task for host managed_node2 25052 1726882479.77789: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25052 1726882479.77969: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882479.77997: getting variables 25052 1726882479.77999: in VariableManager get_vars() 25052 1726882479.78126: Calling all_inventory to load vars for managed_node2 25052 1726882479.78129: Calling groups_inventory to load vars for managed_node2 25052 1726882479.78132: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882479.78141: Calling all_plugins_play to load vars for managed_node2 25052 1726882479.78143: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882479.78166: Calling groups_plugins_play to load vars for managed_node2 25052 1726882479.79833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882479.80730: done with get_vars() 25052 1726882479.80748: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:39 -0400 (0:00:03.068) 0:00:16.762 ****** 25052 1726882479.80813: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25052 1726882479.80815: Creating lock for fedora.linux_system_roles.network_state 25052 1726882479.81070: worker is 1 (out of 1 available) 25052 1726882479.81084: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25052 1726882479.81097: done queuing things up, now waiting for results queue to drain 25052 1726882479.81099: waiting for pending results... 25052 1726882479.81276: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 25052 1726882479.81367: in run() - task 12673a56-9f93-f7f6-4a6d-000000000028 25052 1726882479.81379: variable 'ansible_search_path' from source: unknown 25052 1726882479.81382: variable 'ansible_search_path' from source: unknown 25052 1726882479.81417: calling self._execute() 25052 1726882479.81490: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.81498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.81505: variable 'omit' from source: magic vars 25052 1726882479.81774: variable 'ansible_distribution_major_version' from source: facts 25052 1726882479.81784: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882479.81871: variable 'network_state' from source: role '' defaults 25052 1726882479.81874: Evaluated conditional (network_state != {}): False 25052 1726882479.81877: when evaluation is False, skipping this task 25052 1726882479.81879: _execute() done 25052 1726882479.81882: dumping result to json 25052 1726882479.81896: done dumping result, returning 25052 1726882479.81899: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-f7f6-4a6d-000000000028] 25052 1726882479.81902: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000028 25052 1726882479.81978: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000028 25052 1726882479.81981: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882479.82046: no more pending results, returning what we have 25052 1726882479.82050: results queue empty 25052 1726882479.82050: checking for any_errors_fatal 25052 1726882479.82062: done checking for 
any_errors_fatal 25052 1726882479.82063: checking for max_fail_percentage 25052 1726882479.82064: done checking for max_fail_percentage 25052 1726882479.82065: checking to see if all hosts have failed and the running result is not ok 25052 1726882479.82066: done checking to see if all hosts have failed 25052 1726882479.82066: getting the remaining hosts for this loop 25052 1726882479.82068: done getting the remaining hosts for this loop 25052 1726882479.82072: getting the next task for host managed_node2 25052 1726882479.82079: done getting next task for host managed_node2 25052 1726882479.82084: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25052 1726882479.82086: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882479.82106: getting variables 25052 1726882479.82108: in VariableManager get_vars() 25052 1726882479.82142: Calling all_inventory to load vars for managed_node2 25052 1726882479.82145: Calling groups_inventory to load vars for managed_node2 25052 1726882479.82147: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882479.82154: Calling all_plugins_play to load vars for managed_node2 25052 1726882479.82156: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882479.82158: Calling groups_plugins_play to load vars for managed_node2 25052 1726882479.83351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882479.84810: done with get_vars() 25052 1726882479.84837: done getting variables 25052 1726882479.84902: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:39 -0400 (0:00:00.041) 0:00:16.804 ****** 25052 1726882479.84937: entering _queue_task() for managed_node2/debug 25052 1726882479.85275: worker is 1 (out of 1 available) 25052 1726882479.85288: exiting _queue_task() for managed_node2/debug 25052 1726882479.85500: done queuing things up, now waiting for results queue to drain 25052 1726882479.85502: waiting for pending results... 
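The task being queued here, "Show stderr messages for the network_connections" (main.yml:177), is a plain debug action over the result registered by the module run above; the log that follows confirms the debug action plugin is loaded and that __network_connections_result.stderr_lines is what gets rendered for managed_node2. The task body itself is not reproduced in this log, but a sketch consistent with the logged name, action, and variable would be:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines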
25052 1726882479.85631: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25052 1726882479.85729: in run() - task 12673a56-9f93-f7f6-4a6d-000000000029 25052 1726882479.85750: variable 'ansible_search_path' from source: unknown 25052 1726882479.85759: variable 'ansible_search_path' from source: unknown 25052 1726882479.85836: calling self._execute() 25052 1726882479.85903: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.85916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.85933: variable 'omit' from source: magic vars 25052 1726882479.86312: variable 'ansible_distribution_major_version' from source: facts 25052 1726882479.86330: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882479.86377: variable 'omit' from source: magic vars 25052 1726882479.86412: variable 'omit' from source: magic vars 25052 1726882479.86451: variable 'omit' from source: magic vars 25052 1726882479.86501: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882479.86540: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882479.86594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882479.86598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882479.86612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882479.86651: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882479.86661: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.86700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.86779: Set connection var ansible_pipelining to False 25052 1726882479.86787: Set connection var ansible_connection to ssh 25052 1726882479.86795: Set connection var ansible_shell_type to sh 25052 1726882479.86813: Set connection var ansible_timeout to 10 25052 1726882479.86827: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882479.86898: Set connection var ansible_shell_executable to /bin/sh 25052 1726882479.86901: variable 'ansible_shell_executable' from source: unknown 25052 1726882479.86903: variable 'ansible_connection' from source: unknown 25052 1726882479.86906: variable 'ansible_module_compression' from source: unknown 25052 1726882479.86908: variable 'ansible_shell_type' from source: unknown 25052 1726882479.86914: variable 'ansible_shell_executable' from source: unknown 25052 1726882479.86917: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.86919: variable 'ansible_pipelining' from source: unknown 25052 1726882479.86921: variable 'ansible_timeout' from source: unknown 25052 1726882479.86923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.87051: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 
1726882479.87068: variable 'omit' from source: magic vars 25052 1726882479.87078: starting attempt loop 25052 1726882479.87085: running the handler 25052 1726882479.87214: variable '__network_connections_result' from source: set_fact 25052 1726882479.87273: handler run complete 25052 1726882479.87348: attempt loop complete, returning result 25052 1726882479.87351: _execute() done 25052 1726882479.87353: dumping result to json 25052 1726882479.87355: done dumping result, returning 25052 1726882479.87358: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-f7f6-4a6d-000000000029] 25052 1726882479.87360: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000029 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active)" ] } 25052 1726882479.87518: no more pending results, returning what we have 25052 1726882479.87521: results queue empty 25052 1726882479.87522: checking for any_errors_fatal 25052 1726882479.87529: done checking for any_errors_fatal 25052 1726882479.87530: checking for max_fail_percentage 25052 1726882479.87532: done checking for max_fail_percentage 25052 1726882479.87533: checking to see if all hosts have failed and the running result is not ok 25052 1726882479.87534: done checking to see if all hosts have failed 25052 1726882479.87535: getting the remaining hosts for this loop 25052 1726882479.87536: done getting the remaining hosts for this loop 25052 1726882479.87540: getting the next task for host managed_node2 25052 1726882479.87547: done getting next task for host managed_node2 25052 1726882479.87551: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25052 1726882479.87554: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882479.87565: getting variables 25052 1726882479.87568: in VariableManager get_vars() 25052 1726882479.87611: Calling all_inventory to load vars for managed_node2 25052 1726882479.87615: Calling groups_inventory to load vars for managed_node2 25052 1726882479.87618: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882479.87629: Calling all_plugins_play to load vars for managed_node2 25052 1726882479.87632: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882479.87635: Calling groups_plugins_play to load vars for managed_node2 25052 1726882479.88306: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000029 25052 1726882479.88309: WORKER PROCESS EXITING 25052 1726882479.88977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882479.89934: done with get_vars() 25052 1726882479.89949: done getting variables 25052 1726882479.89990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:39 -0400 (0:00:00.050) 0:00:16.855 ****** 25052 1726882479.90020: entering _queue_task() for managed_node2/debug 25052 1726882479.90259: worker is 1 (out of 1 available) 25052 1726882479.90272: exiting _queue_task() for managed_node2/debug 25052 1726882479.90285: done queuing things up, now waiting for results queue to drain 25052 1726882479.90286: waiting for pending results... 
25052 1726882479.90466: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25052 1726882479.90553: in run() - task 12673a56-9f93-f7f6-4a6d-00000000002a 25052 1726882479.90566: variable 'ansible_search_path' from source: unknown 25052 1726882479.90569: variable 'ansible_search_path' from source: unknown 25052 1726882479.90599: calling self._execute() 25052 1726882479.90670: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.90674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.90683: variable 'omit' from source: magic vars 25052 1726882479.90949: variable 'ansible_distribution_major_version' from source: facts 25052 1726882479.90960: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882479.90963: variable 'omit' from source: magic vars 25052 1726882479.91015: variable 'omit' from source: magic vars 25052 1726882479.91039: variable 'omit' from source: magic vars 25052 1726882479.91103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882479.91187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882479.91194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882479.91198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882479.91200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882479.91202: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882479.91204: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.91206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.91374: Set connection var ansible_pipelining to False 25052 1726882479.91377: Set connection var ansible_connection to ssh 25052 1726882479.91380: Set connection var ansible_shell_type to sh 25052 1726882479.91382: Set connection var ansible_timeout to 10 25052 1726882479.91384: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882479.91386: Set connection var ansible_shell_executable to /bin/sh 25052 1726882479.91388: variable 'ansible_shell_executable' from source: unknown 25052 1726882479.91390: variable 'ansible_connection' from source: unknown 25052 1726882479.91397: variable 'ansible_module_compression' from source: unknown 25052 1726882479.91399: variable 'ansible_shell_type' from source: unknown 25052 1726882479.91401: variable 'ansible_shell_executable' from source: unknown 25052 1726882479.91403: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.91405: variable 'ansible_pipelining' from source: unknown 25052 1726882479.91407: variable 'ansible_timeout' from source: unknown 25052 1726882479.91408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.91542: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 
1726882479.91558: variable 'omit' from source: magic vars 25052 1726882479.91567: starting attempt loop 25052 1726882479.91573: running the handler 25052 1726882479.91639: variable '__network_connections_result' from source: set_fact 25052 1726882479.91732: variable '__network_connections_result' from source: set_fact 25052 1726882479.91872: handler run complete 25052 1726882479.91910: attempt loop complete, returning result 25052 1726882479.91919: _execute() done 25052 1726882479.91926: dumping result to json 25052 1726882479.91935: done dumping result, returning 25052 1726882479.91946: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-f7f6-4a6d-00000000002a] 25052 1726882479.91969: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002a ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "ip": { "address": [ "2001:db8::2/32", "2001:db8::3/32", "2001:db8::4/32" ], "auto6": false, "dhcp4": false, "gateway6": "2001:db8::1" }, "name": "veth0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc\n[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'veth0': add connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc", "[004] #0, state:up persistent_state:present, 'veth0': up connection veth0, 55dc2a1c-03d2-45b8-a3e7-a9c369c581cc (not-active)" ] } } 25052 1726882479.92169: no more pending results, returning what we have 25052 1726882479.92175: results queue empty 25052 1726882479.92176: checking for any_errors_fatal 25052 1726882479.92182: done checking for any_errors_fatal 25052 1726882479.92182: checking for max_fail_percentage 25052 1726882479.92184: done checking for max_fail_percentage 25052 1726882479.92185: checking to see if all hosts have failed and the running result is not ok 25052 1726882479.92185: done checking to see if all hosts have failed 25052 1726882479.92186: getting the remaining hosts for this loop 25052 1726882479.92187: done getting the remaining hosts for this loop 25052 1726882479.92190: getting the next task for host managed_node2 25052 1726882479.92208: done getting next task for host managed_node2 25052 1726882479.92211: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25052 1726882479.92214: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882479.92223: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002a 25052 1726882479.92225: WORKER PROCESS EXITING 25052 1726882479.92232: getting variables 25052 1726882479.92233: in VariableManager get_vars() 25052 1726882479.92269: Calling all_inventory to load vars for managed_node2 25052 1726882479.92276: Calling groups_inventory to load vars for managed_node2 25052 1726882479.92278: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882479.92286: Calling all_plugins_play to load vars for managed_node2 25052 1726882479.92289: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882479.92295: Calling groups_plugins_play to load vars for managed_node2 25052 1726882479.93081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882479.93956: done with get_vars() 25052 1726882479.93972: done getting variables 25052 1726882479.94016: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:39 -0400 (0:00:00.040) 0:00:16.895 ****** 25052 1726882479.94039: entering _queue_task() for managed_node2/debug 25052 1726882479.94262: worker is 1 (out of 1 available) 25052 1726882479.94273: exiting _queue_task() for managed_node2/debug 25052 1726882479.94284: done queuing things up, now waiting for results queue to drain 25052 1726882479.94286: waiting for pending results... 
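The task queued next, "Show debug messages for the network_state" (main.yml:186), follows the same pattern as "Configure networking state" (main.yml:171) earlier in this log: network_state is resolved from the role defaults, the conditional network_state != {} evaluates to False, and the task is skipped. A hedged sketch of that guard, reconstructed only from the logged false_condition; the variable printed by the debug task is not visible here, so it appears as a placeholder:

# defaults/main.yml (sketch): the role default leaves network_state empty
---
network_state: {}

# tasks/main.yml (sketch): state-related tasks are skipped while it stays empty
---
- name: Show debug messages for the network_state
  debug:
    var: __network_state_result   # placeholder name, not confirmed by this log
  when: network_state != {}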
25052 1726882479.94453: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25052 1726882479.94543: in run() - task 12673a56-9f93-f7f6-4a6d-00000000002b 25052 1726882479.94553: variable 'ansible_search_path' from source: unknown 25052 1726882479.94557: variable 'ansible_search_path' from source: unknown 25052 1726882479.94583: calling self._execute() 25052 1726882479.94653: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.94656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.94665: variable 'omit' from source: magic vars 25052 1726882479.94929: variable 'ansible_distribution_major_version' from source: facts 25052 1726882479.94938: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882479.95028: variable 'network_state' from source: role '' defaults 25052 1726882479.95044: Evaluated conditional (network_state != {}): False 25052 1726882479.95064: when evaluation is False, skipping this task 25052 1726882479.95068: _execute() done 25052 1726882479.95070: dumping result to json 25052 1726882479.95073: done dumping result, returning 25052 1726882479.95076: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-f7f6-4a6d-00000000002b] 25052 1726882479.95078: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002b 25052 1726882479.95160: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002b 25052 1726882479.95162: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 25052 1726882479.95240: no more pending results, returning what we have 25052 1726882479.95244: results queue empty 25052 1726882479.95244: checking for any_errors_fatal 25052 1726882479.95252: done checking for any_errors_fatal 25052 1726882479.95253: checking for max_fail_percentage 25052 1726882479.95254: done checking for max_fail_percentage 25052 1726882479.95255: checking to see if all hosts have failed and the running result is not ok 25052 1726882479.95256: done checking to see if all hosts have failed 25052 1726882479.95257: getting the remaining hosts for this loop 25052 1726882479.95258: done getting the remaining hosts for this loop 25052 1726882479.95261: getting the next task for host managed_node2 25052 1726882479.95267: done getting next task for host managed_node2 25052 1726882479.95270: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25052 1726882479.95273: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882479.95285: getting variables 25052 1726882479.95286: in VariableManager get_vars() 25052 1726882479.95320: Calling all_inventory to load vars for managed_node2 25052 1726882479.95323: Calling groups_inventory to load vars for managed_node2 25052 1726882479.95325: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882479.95332: Calling all_plugins_play to load vars for managed_node2 25052 1726882479.95335: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882479.95337: Calling groups_plugins_play to load vars for managed_node2 25052 1726882479.96825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882479.98503: done with get_vars() 25052 1726882479.98535: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:39 -0400 (0:00:00.045) 0:00:16.941 ****** 25052 1726882479.98643: entering _queue_task() for managed_node2/ping 25052 1726882479.98645: Creating lock for ping 25052 1726882479.98989: worker is 1 (out of 1 available) 25052 1726882479.99003: exiting _queue_task() for managed_node2/ping 25052 1726882479.99015: done queuing things up, now waiting for results queue to drain 25052 1726882479.99016: waiting for pending results... 25052 1726882479.99314: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25052 1726882479.99373: in run() - task 12673a56-9f93-f7f6-4a6d-00000000002c 25052 1726882479.99387: variable 'ansible_search_path' from source: unknown 25052 1726882479.99391: variable 'ansible_search_path' from source: unknown 25052 1726882479.99433: calling self._execute() 25052 1726882479.99522: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882479.99600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882479.99603: variable 'omit' from source: magic vars 25052 1726882479.99883: variable 'ansible_distribution_major_version' from source: facts 25052 1726882479.99898: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882479.99906: variable 'omit' from source: magic vars 25052 1726882479.99963: variable 'omit' from source: magic vars 25052 1726882479.99999: variable 'omit' from source: magic vars 25052 1726882480.00039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882480.00072: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882480.00091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882480.00112: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882480.00124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882480.00169: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882480.00172: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.00174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.00498: Set connection var ansible_pipelining to False 25052 
1726882480.00502: Set connection var ansible_connection to ssh 25052 1726882480.00504: Set connection var ansible_shell_type to sh 25052 1726882480.00506: Set connection var ansible_timeout to 10 25052 1726882480.00509: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882480.00511: Set connection var ansible_shell_executable to /bin/sh 25052 1726882480.00513: variable 'ansible_shell_executable' from source: unknown 25052 1726882480.00515: variable 'ansible_connection' from source: unknown 25052 1726882480.00518: variable 'ansible_module_compression' from source: unknown 25052 1726882480.00520: variable 'ansible_shell_type' from source: unknown 25052 1726882480.00522: variable 'ansible_shell_executable' from source: unknown 25052 1726882480.00524: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.00526: variable 'ansible_pipelining' from source: unknown 25052 1726882480.00528: variable 'ansible_timeout' from source: unknown 25052 1726882480.00530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.00533: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882480.00536: variable 'omit' from source: magic vars 25052 1726882480.00538: starting attempt loop 25052 1726882480.00541: running the handler 25052 1726882480.00543: _low_level_execute_command(): starting 25052 1726882480.00554: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882480.01255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.01285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.01302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.01319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.01334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882480.01342: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882480.01353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.01368: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882480.01378: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882480.01384: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882480.01392: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.01414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.01425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.01434: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882480.01439: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882480.01450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.01519: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.01532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.01560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.01660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.03435: stdout chunk (state=3): >>>/root <<< 25052 1726882480.03439: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.03441: stdout chunk (state=3): >>><<< 25052 1726882480.03443: stderr chunk (state=3): >>><<< 25052 1726882480.03446: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.03451: _low_level_execute_command(): starting 25052 1726882480.03503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323 `" && echo ansible-tmp-1726882480.0343907-25825-2960467185323="` echo /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323 `" ) && sleep 0' 25052 1726882480.04400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.04406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882480.04568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
<<< 25052 1726882480.04618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.07201: stdout chunk (state=3): >>>ansible-tmp-1726882480.0343907-25825-2960467185323=/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323 <<< 25052 1726882480.07204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.07206: stdout chunk (state=3): >>><<< 25052 1726882480.07208: stderr chunk (state=3): >>><<< 25052 1726882480.07210: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882480.0343907-25825-2960467185323=/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.07213: variable 'ansible_module_compression' from source: unknown 25052 1726882480.07214: ANSIBALLZ: Using lock for ping 25052 1726882480.07216: ANSIBALLZ: Acquiring lock 25052 1726882480.07218: ANSIBALLZ: Lock acquired: 140207136475088 25052 1726882480.07220: ANSIBALLZ: Creating module 25052 1726882480.21386: ANSIBALLZ: Writing module into payload 25052 1726882480.21458: ANSIBALLZ: Writing module 25052 1726882480.21486: ANSIBALLZ: Renaming module 25052 1726882480.21504: ANSIBALLZ: Done creating module 25052 1726882480.21527: variable 'ansible_facts' from source: unknown 25052 1726882480.21603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py 25052 1726882480.21818: Sending initial data 25052 1726882480.21828: Sent initial data (151 bytes) 25052 1726882480.23014: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.23052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.23117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.23258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.24806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882480.24911: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882480.25059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpr7576yzh /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py <<< 25052 1726882480.25113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py" <<< 25052 1726882480.25257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpr7576yzh" to remote "/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py" <<< 25052 1726882480.27710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.27740: stderr chunk (state=3): >>><<< 25052 1726882480.27761: stdout chunk (state=3): >>><<< 25052 1726882480.27917: done transferring module to remote 25052 1726882480.27924: _low_level_execute_command(): starting 25052 1726882480.27926: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/ /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py && sleep 0' 25052 1726882480.29567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.29684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.29687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.29710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.29732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882480.29739: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882480.29742: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.29798: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882480.29802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882480.29804: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882480.29806: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.29809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.29919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.29922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.29925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.30040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.31805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.31830: stderr chunk (state=3): >>><<< 25052 1726882480.31833: stdout chunk (state=3): >>><<< 25052 1726882480.31921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.31924: _low_level_execute_command(): starting 25052 1726882480.31927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/AnsiballZ_ping.py && sleep 0' 25052 1726882480.32468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.32561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.32591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.32608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.32627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.32720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.47497: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25052 1726882480.48712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882480.48773: stderr chunk (state=3): >>><<< 25052 1726882480.48776: stdout chunk (state=3): >>><<< 25052 1726882480.48913: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882480.48917: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882480.48920: _low_level_execute_command(): starting 25052 1726882480.48922: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882480.0343907-25825-2960467185323/ > /dev/null 2>&1 && sleep 0' 25052 1726882480.50098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.50120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.50176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.50256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.50275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.50434: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.52217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.52269: stderr chunk (state=3): >>><<< 25052 1726882480.52273: stdout chunk (state=3): >>><<< 25052 1726882480.52305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.52311: handler run complete 25052 1726882480.52328: attempt loop complete, returning result 25052 1726882480.52331: _execute() done 25052 1726882480.52333: dumping result to json 25052 1726882480.52335: done dumping result, returning 25052 1726882480.52398: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-f7f6-4a6d-00000000002c] 25052 1726882480.52402: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002c 25052 1726882480.52462: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000002c 25052 1726882480.52464: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 25052 1726882480.52560: no more pending results, returning what we have 25052 1726882480.52564: results queue empty 25052 1726882480.52565: checking for any_errors_fatal 25052 1726882480.52572: done checking for any_errors_fatal 25052 1726882480.52573: checking for max_fail_percentage 25052 1726882480.52576: done checking for max_fail_percentage 25052 1726882480.52576: checking to see if all hosts have failed and the running result is not ok 25052 1726882480.52577: done checking to see if all hosts have failed 25052 1726882480.52578: getting the remaining hosts for this loop 25052 1726882480.52582: done getting the remaining hosts for this loop 25052 1726882480.52586: getting the next task for host managed_node2 25052 1726882480.52807: done getting next task for host managed_node2 25052 1726882480.52810: ^ task is: TASK: meta (role_complete) 25052 1726882480.52812: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882480.52825: getting variables 25052 1726882480.52826: in VariableManager get_vars() 25052 1726882480.52865: Calling all_inventory to load vars for managed_node2 25052 1726882480.52868: Calling groups_inventory to load vars for managed_node2 25052 1726882480.52871: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.52880: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.52883: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.52886: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.54947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.57302: done with get_vars() 25052 1726882480.57444: done getting variables 25052 1726882480.57654: done queuing things up, now waiting for results queue to drain 25052 1726882480.57656: results queue empty 25052 1726882480.57657: checking for any_errors_fatal 25052 1726882480.57660: done checking for any_errors_fatal 25052 1726882480.57660: checking for max_fail_percentage 25052 1726882480.57661: done checking for max_fail_percentage 25052 1726882480.57662: checking to see if all hosts have failed and the running result is not ok 25052 1726882480.57663: done checking to see if all hosts have failed 25052 1726882480.57663: getting the remaining hosts for this loop 25052 1726882480.57664: done getting the remaining hosts for this loop 25052 1726882480.57667: getting the next task for host managed_node2 25052 1726882480.57671: done getting next task for host managed_node2 25052 1726882480.57673: ^ task is: TASK: Include the task 'assert_device_present.yml' 25052 1726882480.57675: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882480.57677: getting variables 25052 1726882480.57678: in VariableManager get_vars() 25052 1726882480.57763: Calling all_inventory to load vars for managed_node2 25052 1726882480.57766: Calling groups_inventory to load vars for managed_node2 25052 1726882480.57768: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.57773: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.57775: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.57778: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.59543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.61861: done with get_vars() 25052 1726882480.61885: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:47 Friday 20 September 2024 21:34:40 -0400 (0:00:00.633) 0:00:17.574 ****** 25052 1726882480.61975: entering _queue_task() for managed_node2/include_tasks 25052 1726882480.62410: worker is 1 (out of 1 available) 25052 1726882480.62424: exiting _queue_task() for managed_node2/include_tasks 25052 1726882480.62436: done queuing things up, now waiting for results queue to drain 25052 1726882480.62437: waiting for pending results... 
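The trace above covers the role's final "Re-test connectivity" step: the ping module is packaged as AnsiballZ_ping.py, copied into a per-task directory under /root/.ansible/tmp, executed with /usr/bin/python3.12, and the temporary directory is removed once {"ping": "pong"} comes back with changed=false. The task file at roles/network/tasks/main.yml:192 is not reproduced in this log, so the YAML below is only a minimal sketch of an equivalent connectivity re-test, not the role's actual source.

# Sketch of a connectivity re-test task; the real task at
# roles/network/tasks/main.yml:192 is not shown in this trace.
- name: Re-test connectivity
  ansible.builtin.ping:   # returns {"ping": "pong"} and changed=false on success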
25052 1726882480.62909: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 25052 1726882480.62914: in run() - task 12673a56-9f93-f7f6-4a6d-00000000005c 25052 1726882480.62917: variable 'ansible_search_path' from source: unknown 25052 1726882480.62920: calling self._execute() 25052 1726882480.62923: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.62934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.62949: variable 'omit' from source: magic vars 25052 1726882480.63292: variable 'ansible_distribution_major_version' from source: facts 25052 1726882480.63313: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882480.63325: _execute() done 25052 1726882480.63334: dumping result to json 25052 1726882480.63341: done dumping result, returning 25052 1726882480.63351: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-f7f6-4a6d-00000000005c] 25052 1726882480.63360: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005c 25052 1726882480.63485: no more pending results, returning what we have 25052 1726882480.63490: in VariableManager get_vars() 25052 1726882480.63542: Calling all_inventory to load vars for managed_node2 25052 1726882480.63610: Calling groups_inventory to load vars for managed_node2 25052 1726882480.63613: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.63630: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.63638: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.63643: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.64248: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005c 25052 1726882480.64251: WORKER PROCESS EXITING 25052 1726882480.65788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.67745: done with get_vars() 25052 1726882480.67783: variable 'ansible_search_path' from source: unknown 25052 1726882480.67832: we have included files to process 25052 1726882480.67834: generating all_blocks data 25052 1726882480.67836: done generating all_blocks data 25052 1726882480.67844: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25052 1726882480.67845: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25052 1726882480.67849: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 25052 1726882480.68045: in VariableManager get_vars() 25052 1726882480.68077: done with get_vars() 25052 1726882480.68190: done processing included file 25052 1726882480.68194: iterating over new_blocks loaded from include file 25052 1726882480.68196: in VariableManager get_vars() 25052 1726882480.68213: done with get_vars() 25052 1726882480.68215: filtering new block on tags 25052 1726882480.68233: done filtering new block on tags 25052 1726882480.68235: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 25052 1726882480.68241: extending task lists for 
all hosts with included blocks 25052 1726882480.71445: done extending task lists 25052 1726882480.71447: done processing included files 25052 1726882480.71447: results queue empty 25052 1726882480.71448: checking for any_errors_fatal 25052 1726882480.71450: done checking for any_errors_fatal 25052 1726882480.71451: checking for max_fail_percentage 25052 1726882480.71452: done checking for max_fail_percentage 25052 1726882480.71453: checking to see if all hosts have failed and the running result is not ok 25052 1726882480.71454: done checking to see if all hosts have failed 25052 1726882480.71455: getting the remaining hosts for this loop 25052 1726882480.71456: done getting the remaining hosts for this loop 25052 1726882480.71458: getting the next task for host managed_node2 25052 1726882480.71462: done getting next task for host managed_node2 25052 1726882480.71464: ^ task is: TASK: Include the task 'get_interface_stat.yml' 25052 1726882480.71467: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882480.71469: getting variables 25052 1726882480.71470: in VariableManager get_vars() 25052 1726882480.71487: Calling all_inventory to load vars for managed_node2 25052 1726882480.71490: Calling groups_inventory to load vars for managed_node2 25052 1726882480.71492: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.71500: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.71503: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.71506: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.72668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.75149: done with get_vars() 25052 1726882480.75180: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:34:40 -0400 (0:00:00.132) 0:00:17.707 ****** 25052 1726882480.75266: entering _queue_task() for managed_node2/include_tasks 25052 1726882480.75620: worker is 1 (out of 1 available) 25052 1726882480.75633: exiting _queue_task() for managed_node2/include_tasks 25052 1726882480.75644: done queuing things up, now waiting for results queue to drain 25052 1726882480.75645: waiting for pending results... 
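The two include steps traced here chain the test helpers together: tests_ipv6.yml:47 pulls in assert_device_present.yml, and its first task (assert_device_present.yml:3) pulls in get_interface_stat.yml. Neither included file is reproduced in this log, so the following is only a sketch of how such include_tasks entries are commonly written, with the file names taken from the task paths printed above (the relative paths are assumptions).

# Sketch only; the real contents of tests_ipv6.yml and
# assert_device_present.yml are not shown in this trace.
- name: Include the task 'assert_device_present.yml'
  ansible.builtin.include_tasks: tasks/assert_device_present.yml

# ...and inside assert_device_present.yml, at line 3:
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: get_interface_stat.yml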
25052 1726882480.75917: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 25052 1726882480.76017: in run() - task 12673a56-9f93-f7f6-4a6d-0000000002b5 25052 1726882480.76039: variable 'ansible_search_path' from source: unknown 25052 1726882480.76047: variable 'ansible_search_path' from source: unknown 25052 1726882480.76085: calling self._execute() 25052 1726882480.76185: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.76199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.76217: variable 'omit' from source: magic vars 25052 1726882480.76659: variable 'ansible_distribution_major_version' from source: facts 25052 1726882480.76663: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882480.76665: _execute() done 25052 1726882480.76668: dumping result to json 25052 1726882480.76670: done dumping result, returning 25052 1726882480.76673: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-f7f6-4a6d-0000000002b5] 25052 1726882480.76675: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000002b5 25052 1726882480.76747: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000002b5 25052 1726882480.76750: WORKER PROCESS EXITING 25052 1726882480.76780: no more pending results, returning what we have 25052 1726882480.76786: in VariableManager get_vars() 25052 1726882480.76839: Calling all_inventory to load vars for managed_node2 25052 1726882480.76842: Calling groups_inventory to load vars for managed_node2 25052 1726882480.76845: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.76859: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.76863: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.76866: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.78534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.80012: done with get_vars() 25052 1726882480.80031: variable 'ansible_search_path' from source: unknown 25052 1726882480.80032: variable 'ansible_search_path' from source: unknown 25052 1726882480.80067: we have included files to process 25052 1726882480.80069: generating all_blocks data 25052 1726882480.80070: done generating all_blocks data 25052 1726882480.80072: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25052 1726882480.80073: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25052 1726882480.80075: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 25052 1726882480.80299: done processing included file 25052 1726882480.80301: iterating over new_blocks loaded from include file 25052 1726882480.80302: in VariableManager get_vars() 25052 1726882480.80320: done with get_vars() 25052 1726882480.80321: filtering new block on tags 25052 1726882480.80335: done filtering new block on tags 25052 1726882480.80337: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 25052 
1726882480.80341: extending task lists for all hosts with included blocks 25052 1726882480.80440: done extending task lists 25052 1726882480.80441: done processing included files 25052 1726882480.80442: results queue empty 25052 1726882480.80442: checking for any_errors_fatal 25052 1726882480.80445: done checking for any_errors_fatal 25052 1726882480.80446: checking for max_fail_percentage 25052 1726882480.80447: done checking for max_fail_percentage 25052 1726882480.80447: checking to see if all hosts have failed and the running result is not ok 25052 1726882480.80448: done checking to see if all hosts have failed 25052 1726882480.80449: getting the remaining hosts for this loop 25052 1726882480.80450: done getting the remaining hosts for this loop 25052 1726882480.80452: getting the next task for host managed_node2 25052 1726882480.80457: done getting next task for host managed_node2 25052 1726882480.80459: ^ task is: TASK: Get stat for interface {{ interface }} 25052 1726882480.80462: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882480.80464: getting variables 25052 1726882480.80465: in VariableManager get_vars() 25052 1726882480.80478: Calling all_inventory to load vars for managed_node2 25052 1726882480.80480: Calling groups_inventory to load vars for managed_node2 25052 1726882480.80482: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882480.80487: Calling all_plugins_play to load vars for managed_node2 25052 1726882480.80489: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882480.80492: Calling groups_plugins_play to load vars for managed_node2 25052 1726882480.81573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882480.83034: done with get_vars() 25052 1726882480.83056: done getting variables 25052 1726882480.83210: variable 'interface' from source: play vars TASK [Get stat for interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:34:40 -0400 (0:00:00.079) 0:00:17.787 ****** 25052 1726882480.83240: entering _queue_task() for managed_node2/stat 25052 1726882480.83574: worker is 1 (out of 1 available) 25052 1726882480.83589: exiting _queue_task() for managed_node2/stat 25052 1726882480.83601: done queuing things up, now waiting for results queue to drain 25052 1726882480.83603: waiting for pending results... 
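The task name "Get stat for interface {{ interface }}" is templated from the play variable interface, which resolves to veth0 here, and the module invocation later in this trace shows stat being pointed at /sys/class/net/veth0 with attribute, checksum and MIME collection disabled. A minimal sketch consistent with those module arguments follows; the register name is an assumption, since get_interface_stat.yml itself is not reproduced in the log.

# Sketch matching the module_args visible further down in the trace;
# the real task lives at get_interface_stat.yml:3. The register name
# 'interface_stat' is assumed for illustration.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # /sys/class/net/veth0 on this host
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat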
25052 1726882480.84014: running TaskExecutor() for managed_node2/TASK: Get stat for interface veth0 25052 1726882480.84020: in run() - task 12673a56-9f93-f7f6-4a6d-0000000003a0 25052 1726882480.84036: variable 'ansible_search_path' from source: unknown 25052 1726882480.84044: variable 'ansible_search_path' from source: unknown 25052 1726882480.84084: calling self._execute() 25052 1726882480.84181: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.84192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.84210: variable 'omit' from source: magic vars 25052 1726882480.84580: variable 'ansible_distribution_major_version' from source: facts 25052 1726882480.84600: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882480.84612: variable 'omit' from source: magic vars 25052 1726882480.84660: variable 'omit' from source: magic vars 25052 1726882480.84757: variable 'interface' from source: play vars 25052 1726882480.84785: variable 'omit' from source: magic vars 25052 1726882480.84832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882480.84874: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882480.84984: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882480.84987: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882480.84990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882480.84992: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882480.84996: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.84998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.85098: Set connection var ansible_pipelining to False 25052 1726882480.85107: Set connection var ansible_connection to ssh 25052 1726882480.85114: Set connection var ansible_shell_type to sh 25052 1726882480.85127: Set connection var ansible_timeout to 10 25052 1726882480.85140: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882480.85150: Set connection var ansible_shell_executable to /bin/sh 25052 1726882480.85175: variable 'ansible_shell_executable' from source: unknown 25052 1726882480.85184: variable 'ansible_connection' from source: unknown 25052 1726882480.85192: variable 'ansible_module_compression' from source: unknown 25052 1726882480.85206: variable 'ansible_shell_type' from source: unknown 25052 1726882480.85214: variable 'ansible_shell_executable' from source: unknown 25052 1726882480.85222: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882480.85230: variable 'ansible_pipelining' from source: unknown 25052 1726882480.85236: variable 'ansible_timeout' from source: unknown 25052 1726882480.85244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882480.85526: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882480.85530: variable 'omit' from 
source: magic vars 25052 1726882480.85533: starting attempt loop 25052 1726882480.85536: running the handler 25052 1726882480.85538: _low_level_execute_command(): starting 25052 1726882480.85540: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882480.86286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.86308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.86411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.88074: stdout chunk (state=3): >>>/root <<< 25052 1726882480.88236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.88239: stdout chunk (state=3): >>><<< 25052 1726882480.88242: stderr chunk (state=3): >>><<< 25052 1726882480.88366: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.88370: _low_level_execute_command(): starting 25052 1726882480.88373: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967 `" && echo ansible-tmp-1726882480.8827028-25868-97738146387967="` echo /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967 `" ) && sleep 0' 25052 1726882480.88984: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.89009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.89024: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.89056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.89162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.89197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.89223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.89324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.91211: stdout chunk (state=3): >>>ansible-tmp-1726882480.8827028-25868-97738146387967=/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967 <<< 25052 1726882480.91403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.91407: stdout chunk (state=3): >>><<< 25052 1726882480.91410: stderr chunk (state=3): >>><<< 25052 1726882480.91412: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882480.8827028-25868-97738146387967=/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.91460: variable 'ansible_module_compression' from source: unknown 25052 1726882480.91531: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25052 1726882480.91571: variable 
'ansible_facts' from source: unknown 25052 1726882480.91677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py 25052 1726882480.91858: Sending initial data 25052 1726882480.91861: Sent initial data (152 bytes) 25052 1726882480.92502: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.92506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.92508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.92510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.92572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.92589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.92637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.92697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.94222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882480.94314: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882480.94396: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp_fvgc3vp /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py <<< 25052 1726882480.94400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py" <<< 25052 1726882480.94449: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp_fvgc3vp" to remote "/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py" <<< 25052 1726882480.95233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.95327: stderr chunk (state=3): >>><<< 25052 1726882480.95330: stdout chunk (state=3): >>><<< 25052 1726882480.95385: done transferring module to remote 25052 1726882480.95404: _low_level_execute_command(): starting 25052 1726882480.95413: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/ /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py && sleep 0' 25052 1726882480.96115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.96131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.96148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.96167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.96230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882480.96295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882480.96341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.96344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.96599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882480.98198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882480.98267: stderr chunk (state=3): >>><<< 25052 1726882480.98276: stdout chunk (state=3): >>><<< 25052 1726882480.98305: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882480.98313: _low_level_execute_command(): starting 25052 1726882480.98322: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/AnsiballZ_stat.py && sleep 0' 25052 1726882480.98949: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882480.98965: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.99017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882480.99031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882480.99135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882480.99139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882480.99170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882480.99289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.14232: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29275, "dev": 23, "nlink": 1, "atime": 1726882469.897565, "mtime": 1726882469.897565, "ctime": 1726882469.897565, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": 
"root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25052 1726882481.15898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882481.15903: stdout chunk (state=3): >>><<< 25052 1726882481.15905: stderr chunk (state=3): >>><<< 25052 1726882481.15907: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/veth0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29275, "dev": 23, "nlink": 1, "atime": 1726882469.897565, "mtime": 1726882469.897565, "ctime": 1726882469.897565, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882481.15909: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882481.15911: _low_level_execute_command(): starting 25052 1726882481.15913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882480.8827028-25868-97738146387967/ > /dev/null 2>&1 && sleep 0' 25052 1726882481.16531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882481.16546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882481.16610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.16673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.16704: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.16715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.16816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.18999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882481.19002: stdout chunk (state=3): >>><<< 25052 1726882481.19004: stderr chunk (state=3): >>><<< 25052 1726882481.19007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882481.19199: handler run complete 25052 1726882481.19203: attempt loop complete, returning result 25052 1726882481.19206: _execute() done 25052 1726882481.19208: dumping result to json 25052 1726882481.19210: done dumping result, returning 25052 1726882481.19212: done running TaskExecutor() for managed_node2/TASK: Get stat for interface veth0 [12673a56-9f93-f7f6-4a6d-0000000003a0] 25052 1726882481.19214: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003a0 25052 1726882481.19299: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003a0 25052 1726882481.19303: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882469.897565, "block_size": 4096, "blocks": 0, "ctime": 1726882469.897565, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29275, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/veth0", "lnk_target": "../../devices/virtual/net/veth0", "mode": "0777", "mtime": 1726882469.897565, "nlink": 1, "path": "/sys/class/net/veth0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 25052 1726882481.19394: no more pending results, returning what we have 25052 1726882481.19399: results queue empty 25052 1726882481.19400: checking for any_errors_fatal 25052 1726882481.19401: done checking for any_errors_fatal 25052 1726882481.19402: checking for max_fail_percentage 25052 1726882481.19404: done checking for max_fail_percentage 25052 1726882481.19405: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.19406: done checking to see if all hosts have failed 25052 1726882481.19407: getting the remaining hosts for this loop 25052 1726882481.19408: done getting the remaining hosts for this loop 25052 1726882481.19412: getting the next task for host managed_node2 25052 1726882481.19422: done getting next task for host managed_node2 25052 1726882481.19424: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 25052 1726882481.19427: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.19433: getting variables 25052 1726882481.19435: in VariableManager get_vars() 25052 1726882481.19478: Calling all_inventory to load vars for managed_node2 25052 1726882481.19481: Calling groups_inventory to load vars for managed_node2 25052 1726882481.19484: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.19614: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.19620: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.19623: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.26569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.29055: done with get_vars() 25052 1726882481.29083: done getting variables 25052 1726882481.29176: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 25052 1726882481.29282: variable 'interface' from source: play vars TASK [Assert that the interface is present - 'veth0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:34:41 -0400 (0:00:00.460) 0:00:18.248 ****** 25052 1726882481.29319: entering _queue_task() for managed_node2/assert 25052 1726882481.29320: Creating lock for assert 25052 1726882481.29689: worker is 1 (out of 1 available) 25052 1726882481.29865: exiting _queue_task() for managed_node2/assert 25052 1726882481.29874: done queuing things up, now waiting for results queue to drain 25052 1726882481.29875: waiting for pending results... 
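The task queued here is a plain assert against the result of the previous stat task; the conditional it evaluates, interface_stat.stat.exists, is visible a little further below when the handler runs. A minimal sketch of such a task, assuming the templated name seen in the task banner:

    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists
        # any failure message is not visible in this run because the assertion passed
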
25052 1726882481.30281: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'veth0' 25052 1726882481.30287: in run() - task 12673a56-9f93-f7f6-4a6d-0000000002b6 25052 1726882481.30290: variable 'ansible_search_path' from source: unknown 25052 1726882481.30297: variable 'ansible_search_path' from source: unknown 25052 1726882481.30300: calling self._execute() 25052 1726882481.30303: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.30306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.30527: variable 'omit' from source: magic vars 25052 1726882481.31124: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.31129: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.31131: variable 'omit' from source: magic vars 25052 1726882481.31133: variable 'omit' from source: magic vars 25052 1726882481.31140: variable 'interface' from source: play vars 25052 1726882481.31218: variable 'omit' from source: magic vars 25052 1726882481.31258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882481.31306: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882481.31326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882481.31345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.31358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.31398: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882481.31401: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.31404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.31527: Set connection var ansible_pipelining to False 25052 1726882481.31531: Set connection var ansible_connection to ssh 25052 1726882481.31533: Set connection var ansible_shell_type to sh 25052 1726882481.31540: Set connection var ansible_timeout to 10 25052 1726882481.31547: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882481.31552: Set connection var ansible_shell_executable to /bin/sh 25052 1726882481.31574: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.31578: variable 'ansible_connection' from source: unknown 25052 1726882481.31580: variable 'ansible_module_compression' from source: unknown 25052 1726882481.31583: variable 'ansible_shell_type' from source: unknown 25052 1726882481.31585: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.31587: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.31589: variable 'ansible_pipelining' from source: unknown 25052 1726882481.31597: variable 'ansible_timeout' from source: unknown 25052 1726882481.31603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.31901: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25052 1726882481.31905: variable 'omit' from source: magic vars 25052 1726882481.31907: starting attempt loop 25052 1726882481.31910: running the handler 25052 1726882481.32098: variable 'interface_stat' from source: set_fact 25052 1726882481.32101: Evaluated conditional (interface_stat.stat.exists): True 25052 1726882481.32103: handler run complete 25052 1726882481.32104: attempt loop complete, returning result 25052 1726882481.32106: _execute() done 25052 1726882481.32108: dumping result to json 25052 1726882481.32109: done dumping result, returning 25052 1726882481.32111: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'veth0' [12673a56-9f93-f7f6-4a6d-0000000002b6] 25052 1726882481.32113: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000002b6 25052 1726882481.32181: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000002b6 25052 1726882481.32184: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882481.32239: no more pending results, returning what we have 25052 1726882481.32243: results queue empty 25052 1726882481.32244: checking for any_errors_fatal 25052 1726882481.32263: done checking for any_errors_fatal 25052 1726882481.32264: checking for max_fail_percentage 25052 1726882481.32266: done checking for max_fail_percentage 25052 1726882481.32267: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.32268: done checking to see if all hosts have failed 25052 1726882481.32269: getting the remaining hosts for this loop 25052 1726882481.32271: done getting the remaining hosts for this loop 25052 1726882481.32274: getting the next task for host managed_node2 25052 1726882481.32283: done getting next task for host managed_node2 25052 1726882481.32287: ^ task is: TASK: Include the task 'assert_profile_present.yml' 25052 1726882481.32289: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.32364: getting variables 25052 1726882481.32367: in VariableManager get_vars() 25052 1726882481.32411: Calling all_inventory to load vars for managed_node2 25052 1726882481.32414: Calling groups_inventory to load vars for managed_node2 25052 1726882481.32418: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.32428: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.32432: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.32434: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.34638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.37989: done with get_vars() 25052 1726882481.38259: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:49 Friday 20 September 2024 21:34:41 -0400 (0:00:00.090) 0:00:18.338 ****** 25052 1726882481.38356: entering _queue_task() for managed_node2/include_tasks 25052 1726882481.38848: worker is 1 (out of 1 available) 25052 1726882481.38860: exiting _queue_task() for managed_node2/include_tasks 25052 1726882481.38870: done queuing things up, now waiting for results queue to drain 25052 1726882481.38871: waiting for pending results... 25052 1726882481.39149: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' 25052 1726882481.39232: in run() - task 12673a56-9f93-f7f6-4a6d-00000000005d 25052 1726882481.39244: variable 'ansible_search_path' from source: unknown 25052 1726882481.39284: calling self._execute() 25052 1726882481.39373: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.39377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.39390: variable 'omit' from source: magic vars 25052 1726882481.39870: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.39873: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.39875: _execute() done 25052 1726882481.39877: dumping result to json 25052 1726882481.39878: done dumping result, returning 25052 1726882481.39880: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-f7f6-4a6d-00000000005d] 25052 1726882481.39882: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005d 25052 1726882481.39951: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005d 25052 1726882481.39953: WORKER PROCESS EXITING 25052 1726882481.39989: no more pending results, returning what we have 25052 1726882481.39997: in VariableManager get_vars() 25052 1726882481.40039: Calling all_inventory to load vars for managed_node2 25052 1726882481.40041: Calling groups_inventory to load vars for managed_node2 25052 1726882481.40044: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.40055: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.40058: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.40060: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.41449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.43015: done with get_vars() 25052 
1726882481.43035: variable 'ansible_search_path' from source: unknown 25052 1726882481.43051: we have included files to process 25052 1726882481.43052: generating all_blocks data 25052 1726882481.43054: done generating all_blocks data 25052 1726882481.43058: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25052 1726882481.43059: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25052 1726882481.43062: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 25052 1726882481.43255: in VariableManager get_vars() 25052 1726882481.43278: done with get_vars() 25052 1726882481.43520: done processing included file 25052 1726882481.43522: iterating over new_blocks loaded from include file 25052 1726882481.43523: in VariableManager get_vars() 25052 1726882481.43538: done with get_vars() 25052 1726882481.43540: filtering new block on tags 25052 1726882481.43556: done filtering new block on tags 25052 1726882481.43558: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 25052 1726882481.43562: extending task lists for all hosts with included blocks 25052 1726882481.46386: done extending task lists 25052 1726882481.46388: done processing included files 25052 1726882481.46389: results queue empty 25052 1726882481.46389: checking for any_errors_fatal 25052 1726882481.46402: done checking for any_errors_fatal 25052 1726882481.46404: checking for max_fail_percentage 25052 1726882481.46405: done checking for max_fail_percentage 25052 1726882481.46406: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.46407: done checking to see if all hosts have failed 25052 1726882481.46408: getting the remaining hosts for this loop 25052 1726882481.46409: done getting the remaining hosts for this loop 25052 1726882481.46412: getting the next task for host managed_node2 25052 1726882481.46416: done getting next task for host managed_node2 25052 1726882481.46420: ^ task is: TASK: Include the task 'get_profile_stat.yml' 25052 1726882481.46422: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.46425: getting variables 25052 1726882481.46426: in VariableManager get_vars() 25052 1726882481.46444: Calling all_inventory to load vars for managed_node2 25052 1726882481.46446: Calling groups_inventory to load vars for managed_node2 25052 1726882481.46448: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.46461: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.46465: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.46469: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.47686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.49238: done with get_vars() 25052 1726882481.49262: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:34:41 -0400 (0:00:00.109) 0:00:18.448 ****** 25052 1726882481.49347: entering _queue_task() for managed_node2/include_tasks 25052 1726882481.49689: worker is 1 (out of 1 available) 25052 1726882481.49806: exiting _queue_task() for managed_node2/include_tasks 25052 1726882481.49818: done queuing things up, now waiting for results queue to drain 25052 1726882481.49819: waiting for pending results... 25052 1726882481.50063: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 25052 1726882481.50190: in run() - task 12673a56-9f93-f7f6-4a6d-0000000003b8 25052 1726882481.50198: variable 'ansible_search_path' from source: unknown 25052 1726882481.50201: variable 'ansible_search_path' from source: unknown 25052 1726882481.50205: calling self._execute() 25052 1726882481.50275: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.50278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.50289: variable 'omit' from source: magic vars 25052 1726882481.50674: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.50686: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.50697: _execute() done 25052 1726882481.50700: dumping result to json 25052 1726882481.50703: done dumping result, returning 25052 1726882481.50705: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-f7f6-4a6d-0000000003b8] 25052 1726882481.50712: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003b8 25052 1726882481.50806: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003b8 25052 1726882481.50809: WORKER PROCESS EXITING 25052 1726882481.50927: no more pending results, returning what we have 25052 1726882481.50934: in VariableManager get_vars() 25052 1726882481.50984: Calling all_inventory to load vars for managed_node2 25052 1726882481.50987: Calling groups_inventory to load vars for managed_node2 25052 1726882481.50990: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.51007: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.51010: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.51014: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.52436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 25052 1726882481.53662: done with get_vars() 25052 1726882481.53675: variable 'ansible_search_path' from source: unknown 25052 1726882481.53676: variable 'ansible_search_path' from source: unknown 25052 1726882481.53704: we have included files to process 25052 1726882481.53705: generating all_blocks data 25052 1726882481.53706: done generating all_blocks data 25052 1726882481.53707: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25052 1726882481.53708: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25052 1726882481.53709: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 25052 1726882481.54380: done processing included file 25052 1726882481.54382: iterating over new_blocks loaded from include file 25052 1726882481.54383: in VariableManager get_vars() 25052 1726882481.54399: done with get_vars() 25052 1726882481.54400: filtering new block on tags 25052 1726882481.54415: done filtering new block on tags 25052 1726882481.54417: in VariableManager get_vars() 25052 1726882481.54429: done with get_vars() 25052 1726882481.54430: filtering new block on tags 25052 1726882481.54444: done filtering new block on tags 25052 1726882481.54445: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 25052 1726882481.54449: extending task lists for all hosts with included blocks 25052 1726882481.54548: done extending task lists 25052 1726882481.54549: done processing included files 25052 1726882481.54549: results queue empty 25052 1726882481.54550: checking for any_errors_fatal 25052 1726882481.54552: done checking for any_errors_fatal 25052 1726882481.54553: checking for max_fail_percentage 25052 1726882481.54554: done checking for max_fail_percentage 25052 1726882481.54554: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.54555: done checking to see if all hosts have failed 25052 1726882481.54555: getting the remaining hosts for this loop 25052 1726882481.54556: done getting the remaining hosts for this loop 25052 1726882481.54557: getting the next task for host managed_node2 25052 1726882481.54560: done getting next task for host managed_node2 25052 1726882481.54562: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 25052 1726882481.54563: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.54565: getting variables 25052 1726882481.54565: in VariableManager get_vars() 25052 1726882481.54615: Calling all_inventory to load vars for managed_node2 25052 1726882481.54617: Calling groups_inventory to load vars for managed_node2 25052 1726882481.54618: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.54623: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.54624: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.54626: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.55638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.56602: done with get_vars() 25052 1726882481.56623: done getting variables 25052 1726882481.56653: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:34:41 -0400 (0:00:00.073) 0:00:18.521 ****** 25052 1726882481.56676: entering _queue_task() for managed_node2/set_fact 25052 1726882481.56941: worker is 1 (out of 1 available) 25052 1726882481.56954: exiting _queue_task() for managed_node2/set_fact 25052 1726882481.56966: done queuing things up, now waiting for results queue to drain 25052 1726882481.56968: waiting for pending results... 
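The set_fact task queued here comes from get_profile_stat.yml, reached through two nested include_tasks (tests_ipv6.yml:49 pulls in assert_profile_present.yml, whose line 3 pulls in get_profile_stat.yml). The task result shown further below lists exactly which flags it initializes, so the task amounts to the following set_fact (key order taken from the JSON result, which is sorted alphabetically; the source file may order them differently):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_exists: false
        lsr_net_profile_fingerprint: false
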
25052 1726882481.57163: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 25052 1726882481.57246: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b0 25052 1726882481.57260: variable 'ansible_search_path' from source: unknown 25052 1726882481.57263: variable 'ansible_search_path' from source: unknown 25052 1726882481.57290: calling self._execute() 25052 1726882481.57366: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.57370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.57378: variable 'omit' from source: magic vars 25052 1726882481.57651: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.57660: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.57666: variable 'omit' from source: magic vars 25052 1726882481.57700: variable 'omit' from source: magic vars 25052 1726882481.57741: variable 'omit' from source: magic vars 25052 1726882481.57786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882481.57870: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882481.57873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882481.57877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.57879: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.57918: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882481.57921: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.57923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.58043: Set connection var ansible_pipelining to False 25052 1726882481.58046: Set connection var ansible_connection to ssh 25052 1726882481.58049: Set connection var ansible_shell_type to sh 25052 1726882481.58051: Set connection var ansible_timeout to 10 25052 1726882481.58053: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882481.58055: Set connection var ansible_shell_executable to /bin/sh 25052 1726882481.58111: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.58114: variable 'ansible_connection' from source: unknown 25052 1726882481.58117: variable 'ansible_module_compression' from source: unknown 25052 1726882481.58121: variable 'ansible_shell_type' from source: unknown 25052 1726882481.58123: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.58126: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.58128: variable 'ansible_pipelining' from source: unknown 25052 1726882481.58130: variable 'ansible_timeout' from source: unknown 25052 1726882481.58132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.58258: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882481.58263: variable 
'omit' from source: magic vars 25052 1726882481.58265: starting attempt loop 25052 1726882481.58268: running the handler 25052 1726882481.58270: handler run complete 25052 1726882481.58273: attempt loop complete, returning result 25052 1726882481.58275: _execute() done 25052 1726882481.58277: dumping result to json 25052 1726882481.58279: done dumping result, returning 25052 1726882481.58282: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-f7f6-4a6d-0000000004b0] 25052 1726882481.58286: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b0 25052 1726882481.58434: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b0 25052 1726882481.58437: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 25052 1726882481.58524: no more pending results, returning what we have 25052 1726882481.58527: results queue empty 25052 1726882481.58528: checking for any_errors_fatal 25052 1726882481.58529: done checking for any_errors_fatal 25052 1726882481.58530: checking for max_fail_percentage 25052 1726882481.58532: done checking for max_fail_percentage 25052 1726882481.58532: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.58533: done checking to see if all hosts have failed 25052 1726882481.58534: getting the remaining hosts for this loop 25052 1726882481.58535: done getting the remaining hosts for this loop 25052 1726882481.58538: getting the next task for host managed_node2 25052 1726882481.58544: done getting next task for host managed_node2 25052 1726882481.58546: ^ task is: TASK: Stat profile file 25052 1726882481.58549: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.58553: getting variables 25052 1726882481.58554: in VariableManager get_vars() 25052 1726882481.58586: Calling all_inventory to load vars for managed_node2 25052 1726882481.58588: Calling groups_inventory to load vars for managed_node2 25052 1726882481.58590: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.58603: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.58605: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.58607: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.60361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.62005: done with get_vars() 25052 1726882481.62031: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:34:41 -0400 (0:00:00.054) 0:00:18.576 ****** 25052 1726882481.62135: entering _queue_task() for managed_node2/stat 25052 1726882481.62513: worker is 1 (out of 1 available) 25052 1726882481.62525: exiting _queue_task() for managed_node2/stat 25052 1726882481.62537: done queuing things up, now waiting for results queue to drain 25052 1726882481.62539: waiting for pending results... 25052 1726882481.62954: running TaskExecutor() for managed_node2/TASK: Stat profile file 25052 1726882481.63018: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b1 25052 1726882481.63100: variable 'ansible_search_path' from source: unknown 25052 1726882481.63104: variable 'ansible_search_path' from source: unknown 25052 1726882481.63106: calling self._execute() 25052 1726882481.63201: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.63234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.63249: variable 'omit' from source: magic vars 25052 1726882481.64037: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.64290: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.64297: variable 'omit' from source: magic vars 25052 1726882481.64300: variable 'omit' from source: magic vars 25052 1726882481.64400: variable 'profile' from source: include params 25052 1726882481.64517: variable 'interface' from source: play vars 25052 1726882481.64599: variable 'interface' from source: play vars 25052 1726882481.64743: variable 'omit' from source: magic vars 25052 1726882481.64788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882481.64848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882481.64874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882481.64887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.64903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882481.64941: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882481.64944: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.64947: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.65021: Set connection var ansible_pipelining to False 25052 1726882481.65024: Set connection var ansible_connection to ssh 25052 1726882481.65026: Set connection var ansible_shell_type to sh 25052 1726882481.65032: Set connection var ansible_timeout to 10 25052 1726882481.65038: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882481.65052: Set connection var ansible_shell_executable to /bin/sh 25052 1726882481.65067: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.65069: variable 'ansible_connection' from source: unknown 25052 1726882481.65072: variable 'ansible_module_compression' from source: unknown 25052 1726882481.65074: variable 'ansible_shell_type' from source: unknown 25052 1726882481.65077: variable 'ansible_shell_executable' from source: unknown 25052 1726882481.65079: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.65083: variable 'ansible_pipelining' from source: unknown 25052 1726882481.65086: variable 'ansible_timeout' from source: unknown 25052 1726882481.65089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.65253: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882481.65271: variable 'omit' from source: magic vars 25052 1726882481.65274: starting attempt loop 25052 1726882481.65276: running the handler 25052 1726882481.65282: _low_level_execute_command(): starting 25052 1726882481.65289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882481.65784: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882481.65788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882481.65791: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882481.65796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.65848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.65851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.65857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.65928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.67626: stdout chunk (state=3): >>>/root <<< 25052 1726882481.67745: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 25052 1726882481.67749: stdout chunk (state=3): >>><<< 25052 1726882481.67759: stderr chunk (state=3): >>><<< 25052 1726882481.67783: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882481.67797: _low_level_execute_command(): starting 25052 1726882481.67801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632 `" && echo ansible-tmp-1726882481.6778212-25915-246894011399632="` echo /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632 `" ) && sleep 0' 25052 1726882481.68231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882481.68234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882481.68244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882481.68247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.68281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.68284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.68355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.70233: stdout chunk (state=3): >>>ansible-tmp-1726882481.6778212-25915-246894011399632=/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632 <<< 25052 1726882481.70398: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 25052 1726882481.70401: stdout chunk (state=3): >>><<< 25052 1726882481.70404: stderr chunk (state=3): >>><<< 25052 1726882481.70500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882481.6778212-25915-246894011399632=/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882481.70503: variable 'ansible_module_compression' from source: unknown 25052 1726882481.70552: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 25052 1726882481.70597: variable 'ansible_facts' from source: unknown 25052 1726882481.70692: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py 25052 1726882481.70853: Sending initial data 25052 1726882481.70864: Sent initial data (153 bytes) 25052 1726882481.71459: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882481.71504: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882481.71586: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882481.71601: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.71653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.71719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 
1726882481.73276: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882481.73334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882481.73418: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp7omi_ce8 /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py <<< 25052 1726882481.73421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py" <<< 25052 1726882481.73487: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp7omi_ce8" to remote "/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py" <<< 25052 1726882481.74310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882481.74348: stderr chunk (state=3): >>><<< 25052 1726882481.74360: stdout chunk (state=3): >>><<< 25052 1726882481.74459: done transferring module to remote 25052 1726882481.74462: _low_level_execute_command(): starting 25052 1726882481.74464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/ /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py && sleep 0' 25052 1726882481.75220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.75245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.75262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.75281: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 25052 1726882481.75368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.77148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882481.77160: stdout chunk (state=3): >>><<< 25052 1726882481.77172: stderr chunk (state=3): >>><<< 25052 1726882481.77209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882481.77213: _low_level_execute_command(): starting 25052 1726882481.77288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/AnsiballZ_stat.py && sleep 0' 25052 1726882481.77758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882481.77764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882481.77792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882481.77799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882481.77801: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882481.77803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.77853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.77857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.77866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.77937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
25052 1726882481.92969: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 25052 1726882481.94330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882481.94356: stderr chunk (state=3): >>><<< 25052 1726882481.94359: stdout chunk (state=3): >>><<< 25052 1726882481.94375: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-veth0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
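For orientation, the stat call traced above (module_args: path=/etc/sysconfig/network-scripts/ifcfg-veth0 with get_attributes, get_checksum and get_mime all false) corresponds to a task along the following lines. This is a sketch reconstructed from the logged arguments and from the profile_stat conditionals evaluated further down, not a verbatim copy of get_profile_stat.yml; the register name and the {{ profile }} templating are inferences.

  - name: Stat profile file
    stat:
      path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # rendered as ifcfg-veth0 in this run
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat  # inferred: later conditionals test profile_stat.stat.exists

The exists: false result reported for this task means no legacy initscripts profile is present for veth0, which is what drives the skipped ifcfg checks later in this section.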
25052 1726882481.94404: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882481.94412: _low_level_execute_command(): starting 25052 1726882481.94419: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882481.6778212-25915-246894011399632/ > /dev/null 2>&1 && sleep 0' 25052 1726882481.94860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882481.94897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882481.94901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.94904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882481.94906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882481.94955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882481.94958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882481.94962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882481.95024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882481.96877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882481.96907: stderr chunk (state=3): >>><<< 25052 1726882481.96910: stdout chunk (state=3): >>><<< 25052 1726882481.96922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882481.96930: handler run complete 25052 1726882481.96945: attempt loop complete, returning result 25052 1726882481.96947: _execute() done 25052 1726882481.96950: dumping result to json 25052 1726882481.96952: done dumping result, returning 25052 1726882481.96960: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-f7f6-4a6d-0000000004b1] 25052 1726882481.96964: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b1 25052 1726882481.97057: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b1 25052 1726882481.97060: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 25052 1726882481.97117: no more pending results, returning what we have 25052 1726882481.97121: results queue empty 25052 1726882481.97121: checking for any_errors_fatal 25052 1726882481.97128: done checking for any_errors_fatal 25052 1726882481.97129: checking for max_fail_percentage 25052 1726882481.97130: done checking for max_fail_percentage 25052 1726882481.97131: checking to see if all hosts have failed and the running result is not ok 25052 1726882481.97132: done checking to see if all hosts have failed 25052 1726882481.97133: getting the remaining hosts for this loop 25052 1726882481.97134: done getting the remaining hosts for this loop 25052 1726882481.97138: getting the next task for host managed_node2 25052 1726882481.97146: done getting next task for host managed_node2 25052 1726882481.97148: ^ task is: TASK: Set NM profile exist flag based on the profile files 25052 1726882481.97152: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882481.97157: getting variables 25052 1726882481.97158: in VariableManager get_vars() 25052 1726882481.97200: Calling all_inventory to load vars for managed_node2 25052 1726882481.97204: Calling groups_inventory to load vars for managed_node2 25052 1726882481.97206: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882481.97217: Calling all_plugins_play to load vars for managed_node2 25052 1726882481.97220: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882481.97223: Calling groups_plugins_play to load vars for managed_node2 25052 1726882481.98052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882481.98918: done with get_vars() 25052 1726882481.98936: done getting variables 25052 1726882481.98977: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:34:41 -0400 (0:00:00.368) 0:00:18.944 ****** 25052 1726882481.99004: entering _queue_task() for managed_node2/set_fact 25052 1726882481.99238: worker is 1 (out of 1 available) 25052 1726882481.99251: exiting _queue_task() for managed_node2/set_fact 25052 1726882481.99261: done queuing things up, now waiting for results queue to drain 25052 1726882481.99262: waiting for pending results... 
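The task queued here only fires when the stat above found an ifcfg file, so it is skipped in this run, as the "Evaluated conditional (profile_stat.stat.exists): False" line below shows. A minimal sketch of what such a task plausibly looks like; the fact name is an assumption, modelled on the lsr_net_profile_exists fact that the nmcli-based task sets later:

  - name: Set NM profile exist flag based on the profile files
    set_fact:
      lsr_net_profile_exists: true  # assumed fact name, mirrored from the nmcli-based task further down
    when: profile_stat.stat.exists  # false here, so the task is skipped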
25052 1726882481.99432: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 25052 1726882481.99503: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b2 25052 1726882481.99515: variable 'ansible_search_path' from source: unknown 25052 1726882481.99519: variable 'ansible_search_path' from source: unknown 25052 1726882481.99545: calling self._execute() 25052 1726882481.99619: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882481.99622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882481.99631: variable 'omit' from source: magic vars 25052 1726882481.99887: variable 'ansible_distribution_major_version' from source: facts 25052 1726882481.99898: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882481.99981: variable 'profile_stat' from source: set_fact 25052 1726882481.99995: Evaluated conditional (profile_stat.stat.exists): False 25052 1726882481.99998: when evaluation is False, skipping this task 25052 1726882482.00001: _execute() done 25052 1726882482.00003: dumping result to json 25052 1726882482.00006: done dumping result, returning 25052 1726882482.00009: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-f7f6-4a6d-0000000004b2] 25052 1726882482.00015: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b2 25052 1726882482.00098: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b2 25052 1726882482.00101: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25052 1726882482.00183: no more pending results, returning what we have 25052 1726882482.00186: results queue empty 25052 1726882482.00187: checking for any_errors_fatal 25052 1726882482.00198: done checking for any_errors_fatal 25052 1726882482.00199: checking for max_fail_percentage 25052 1726882482.00200: done checking for max_fail_percentage 25052 1726882482.00201: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.00202: done checking to see if all hosts have failed 25052 1726882482.00202: getting the remaining hosts for this loop 25052 1726882482.00203: done getting the remaining hosts for this loop 25052 1726882482.00206: getting the next task for host managed_node2 25052 1726882482.00212: done getting next task for host managed_node2 25052 1726882482.00214: ^ task is: TASK: Get NM profile info 25052 1726882482.00218: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.00222: getting variables 25052 1726882482.00223: in VariableManager get_vars() 25052 1726882482.00256: Calling all_inventory to load vars for managed_node2 25052 1726882482.00258: Calling groups_inventory to load vars for managed_node2 25052 1726882482.00260: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.00269: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.00271: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.00273: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.01112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.01987: done with get_vars() 25052 1726882482.02004: done getting variables 25052 1726882482.02048: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:34:42 -0400 (0:00:00.030) 0:00:18.975 ****** 25052 1726882482.02069: entering _queue_task() for managed_node2/shell 25052 1726882482.02291: worker is 1 (out of 1 available) 25052 1726882482.02306: exiting _queue_task() for managed_node2/shell 25052 1726882482.02317: done queuing things up, now waiting for results queue to drain 25052 1726882482.02318: waiting for pending results... 25052 1726882482.02809: running TaskExecutor() for managed_node2/TASK: Get NM profile info 25052 1726882482.02815: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b3 25052 1726882482.02819: variable 'ansible_search_path' from source: unknown 25052 1726882482.02822: variable 'ansible_search_path' from source: unknown 25052 1726882482.02825: calling self._execute() 25052 1726882482.02827: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.02830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.02832: variable 'omit' from source: magic vars 25052 1726882482.03187: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.03211: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.03218: variable 'omit' from source: magic vars 25052 1726882482.03262: variable 'omit' from source: magic vars 25052 1726882482.03361: variable 'profile' from source: include params 25052 1726882482.03365: variable 'interface' from source: play vars 25052 1726882482.03426: variable 'interface' from source: play vars 25052 1726882482.03441: variable 'omit' from source: magic vars 25052 1726882482.03473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.03511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.03538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.03551: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.03562: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.03587: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882482.03597: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.03602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.03670: Set connection var ansible_pipelining to False 25052 1726882482.03673: Set connection var ansible_connection to ssh 25052 1726882482.03676: Set connection var ansible_shell_type to sh 25052 1726882482.03681: Set connection var ansible_timeout to 10 25052 1726882482.03687: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.03692: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.03717: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.03720: variable 'ansible_connection' from source: unknown 25052 1726882482.03722: variable 'ansible_module_compression' from source: unknown 25052 1726882482.03724: variable 'ansible_shell_type' from source: unknown 25052 1726882482.03727: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.03729: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.03731: variable 'ansible_pipelining' from source: unknown 25052 1726882482.03734: variable 'ansible_timeout' from source: unknown 25052 1726882482.03738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.03858: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.03866: variable 'omit' from source: magic vars 25052 1726882482.03870: starting attempt loop 25052 1726882482.03872: running the handler 25052 1726882482.03881: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.03900: _low_level_execute_command(): starting 25052 1726882482.03907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882482.04375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.04414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882482.04418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.04422: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.04469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.04472: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.04474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.04545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.06176: stdout chunk (state=3): >>>/root <<< 25052 1726882482.06274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.06306: stderr chunk (state=3): >>><<< 25052 1726882482.06310: stdout chunk (state=3): >>><<< 25052 1726882482.06329: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.06344: _low_level_execute_command(): starting 25052 1726882482.06349: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053 `" && echo ansible-tmp-1726882482.0632927-25933-69048831723053="` echo /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053 `" ) && sleep 0' 25052 1726882482.06765: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.06808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.06811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.06813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882482.06816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882482.06818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.06860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.06863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.06867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.06930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.08789: stdout chunk (state=3): >>>ansible-tmp-1726882482.0632927-25933-69048831723053=/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053 <<< 25052 1726882482.08899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.08925: stderr chunk (state=3): >>><<< 25052 1726882482.08928: stdout chunk (state=3): >>><<< 25052 1726882482.08943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882482.0632927-25933-69048831723053=/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.08972: variable 'ansible_module_compression' from source: unknown 25052 1726882482.09013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882482.09043: variable 'ansible_facts' from source: unknown 25052 1726882482.09103: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py 25052 1726882482.09201: Sending initial data 25052 1726882482.09204: Sent initial data (155 bytes) 25052 1726882482.09642: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.09645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.09648: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.09650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.09701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.09705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.09767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.11306: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25052 1726882482.11310: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882482.11377: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882482.11458: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpjyn49cdn /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py <<< 25052 1726882482.11464: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py" <<< 25052 1726882482.11522: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpjyn49cdn" to remote "/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py" <<< 25052 1726882482.12250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.12299: stderr chunk (state=3): >>><<< 25052 1726882482.12366: stdout chunk (state=3): >>><<< 25052 1726882482.12374: done transferring module to remote 25052 1726882482.12387: _low_level_execute_command(): starting 25052 1726882482.12398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/ /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py && sleep 0' 25052 1726882482.12989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882482.13006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.13030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.13046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.13059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882482.13140: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.13164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.13178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.13200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.13283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.14989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.15019: stderr chunk (state=3): >>><<< 25052 1726882482.15022: stdout chunk (state=3): >>><<< 25052 1726882482.15035: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.15039: _low_level_execute_command(): starting 25052 1726882482.15042: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/AnsiballZ_command.py && sleep 0' 25052 1726882482.15460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.15499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.15502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882482.15504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.15507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.15509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.15554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.15557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.15561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.15631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.32252: stdout chunk (state=3): >>> {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:34:42.304335", "end": "2024-09-20 21:34:42.321552", "delta": "0:00:00.017217", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882482.33635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882482.33660: stderr chunk (state=3): >>><<< 25052 1726882482.33663: stdout chunk (state=3): >>><<< 25052 1726882482.33678: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "veth0 /etc/NetworkManager/system-connections/veth0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "start": "2024-09-20 21:34:42.304335", "end": "2024-09-20 21:34:42.321552", "delta": "0:00:00.017217", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
25052 1726882482.33714: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882482.33723: _low_level_execute_command(): starting 25052 1726882482.33726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882482.0632927-25933-69048831723053/ > /dev/null 2>&1 && sleep 0' 25052 1726882482.34170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.34174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882482.34176: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882482.34178: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.34230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.34233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.34240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.34299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.36074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.36097: stderr chunk (state=3): >>><<< 25052 1726882482.36105: stdout chunk (state=3): >>><<< 25052 1726882482.36118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.36124: handler run complete 25052 1726882482.36141: Evaluated conditional (False): False 25052 1726882482.36148: attempt loop complete, returning result 25052 1726882482.36151: _execute() done 25052 1726882482.36153: dumping result to json 25052 1726882482.36158: done dumping result, returning 25052 1726882482.36167: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-f7f6-4a6d-0000000004b3] 25052 1726882482.36169: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b3 25052 1726882482.36262: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b3 25052 1726882482.36264: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep veth0 | grep /etc", "delta": "0:00:00.017217", "end": "2024-09-20 21:34:42.321552", "rc": 0, "start": "2024-09-20 21:34:42.304335" } STDOUT: veth0 /etc/NetworkManager/system-connections/veth0.nmconnection 25052 1726882482.36335: no more pending results, returning what we have 25052 1726882482.36338: results queue empty 25052 1726882482.36339: checking for any_errors_fatal 25052 1726882482.36345: done checking for any_errors_fatal 25052 1726882482.36346: checking for max_fail_percentage 25052 1726882482.36347: done checking for max_fail_percentage 25052 1726882482.36348: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.36349: done checking to see if all hosts have failed 25052 1726882482.36350: getting the remaining hosts for this loop 25052 1726882482.36351: done getting the remaining hosts for this loop 25052 1726882482.36354: getting the next task for host managed_node2 25052 1726882482.36361: done getting next task for host managed_node2 25052 1726882482.36363: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25052 1726882482.36367: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.36371: getting variables 25052 1726882482.36373: in VariableManager get_vars() 25052 1726882482.36414: Calling all_inventory to load vars for managed_node2 25052 1726882482.36417: Calling groups_inventory to load vars for managed_node2 25052 1726882482.36420: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.36430: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.36433: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.36436: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.37232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.38188: done with get_vars() 25052 1726882482.38206: done getting variables 25052 1726882482.38250: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:34:42 -0400 (0:00:00.362) 0:00:19.337 ****** 25052 1726882482.38272: entering _queue_task() for managed_node2/set_fact 25052 1726882482.38495: worker is 1 (out of 1 available) 25052 1726882482.38510: exiting _queue_task() for managed_node2/set_fact 25052 1726882482.38523: done queuing things up, now waiting for results queue to drain 25052 1726882482.38524: waiting for pending results... 
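The set_fact task queued here is fully visible in the log: the ok result that follows shows exactly which facts it sets, and the "Evaluated conditional" lines show the guard. A sketch matching that output (the distribution-version guard may live on an enclosing block rather than on the task itself):

  - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
    set_fact:
      lsr_net_profile_exists: true
      lsr_net_profile_ansible_managed: true
      lsr_net_profile_fingerprint: true
    when: nm_profile_exists.rc == 0  # true here, since nmcli found the veth0 keyfile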
25052 1726882482.38681: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 25052 1726882482.39000: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b4 25052 1726882482.39004: variable 'ansible_search_path' from source: unknown 25052 1726882482.39007: variable 'ansible_search_path' from source: unknown 25052 1726882482.39010: calling self._execute() 25052 1726882482.39013: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.39015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.39018: variable 'omit' from source: magic vars 25052 1726882482.39281: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.39299: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.39429: variable 'nm_profile_exists' from source: set_fact 25052 1726882482.39445: Evaluated conditional (nm_profile_exists.rc == 0): True 25052 1726882482.39452: variable 'omit' from source: magic vars 25052 1726882482.39500: variable 'omit' from source: magic vars 25052 1726882482.39533: variable 'omit' from source: magic vars 25052 1726882482.39574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.39700: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.39703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.39705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.39708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.39710: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882482.39713: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.39715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.39807: Set connection var ansible_pipelining to False 25052 1726882482.39811: Set connection var ansible_connection to ssh 25052 1726882482.39814: Set connection var ansible_shell_type to sh 25052 1726882482.39821: Set connection var ansible_timeout to 10 25052 1726882482.39835: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.39842: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.39854: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.39857: variable 'ansible_connection' from source: unknown 25052 1726882482.39861: variable 'ansible_module_compression' from source: unknown 25052 1726882482.39863: variable 'ansible_shell_type' from source: unknown 25052 1726882482.39866: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.39868: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.39870: variable 'ansible_pipelining' from source: unknown 25052 1726882482.39872: variable 'ansible_timeout' from source: unknown 25052 1726882482.39951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.40026: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.40036: variable 'omit' from source: magic vars 25052 1726882482.40041: starting attempt loop 25052 1726882482.40045: running the handler 25052 1726882482.40058: handler run complete 25052 1726882482.40065: attempt loop complete, returning result 25052 1726882482.40068: _execute() done 25052 1726882482.40071: dumping result to json 25052 1726882482.40073: done dumping result, returning 25052 1726882482.40082: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-f7f6-4a6d-0000000004b4] 25052 1726882482.40085: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b4 ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 25052 1726882482.40349: no more pending results, returning what we have 25052 1726882482.40352: results queue empty 25052 1726882482.40352: checking for any_errors_fatal 25052 1726882482.40358: done checking for any_errors_fatal 25052 1726882482.40359: checking for max_fail_percentage 25052 1726882482.40360: done checking for max_fail_percentage 25052 1726882482.40361: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.40362: done checking to see if all hosts have failed 25052 1726882482.40362: getting the remaining hosts for this loop 25052 1726882482.40363: done getting the remaining hosts for this loop 25052 1726882482.40366: getting the next task for host managed_node2 25052 1726882482.40374: done getting next task for host managed_node2 25052 1726882482.40376: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 25052 1726882482.40380: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.40383: getting variables 25052 1726882482.40384: in VariableManager get_vars() 25052 1726882482.40429: Calling all_inventory to load vars for managed_node2 25052 1726882482.40432: Calling groups_inventory to load vars for managed_node2 25052 1726882482.40435: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.40441: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b4 25052 1726882482.40444: WORKER PROCESS EXITING 25052 1726882482.40452: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.40455: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.40459: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.41697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.43224: done with get_vars() 25052 1726882482.43246: done getting variables 25052 1726882482.43309: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.43427: variable 'profile' from source: include params 25052 1726882482.43432: variable 'interface' from source: play vars 25052 1726882482.43496: variable 'interface' from source: play vars TASK [Get the ansible_managed comment in ifcfg-veth0] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:34:42 -0400 (0:00:00.052) 0:00:19.390 ****** 25052 1726882482.43533: entering _queue_task() for managed_node2/command 25052 1726882482.43770: worker is 1 (out of 1 available) 25052 1726882482.43782: exiting _queue_task() for managed_node2/command 25052 1726882482.43797: done queuing things up, now waiting for results queue to drain 25052 1726882482.43798: waiting for pending results... 
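The remaining ifcfg-comment checks ("Get the ansible_managed comment in ifcfg-veth0" here and "Verify the ansible_managed comment in ifcfg-veth0" below) are all gated on profile_stat.stat.exists, so with the profile stored as a NetworkManager keyfile they are skipped. Only the task name, the command action plugin, and the when condition are visible in the log; the command body and register name in this sketch are hypothetical:

  - name: Get the ansible_managed comment in ifcfg-{{ profile }}
    command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}  # hypothetical grep; the real pattern is not shown because the task is skipped
    register: ifcfg_ansible_managed_comment  # hypothetical register name
    when: profile_stat.stat.exists  # false in this run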
25052 1726882482.43953: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-veth0 25052 1726882482.44033: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b6 25052 1726882482.44046: variable 'ansible_search_path' from source: unknown 25052 1726882482.44050: variable 'ansible_search_path' from source: unknown 25052 1726882482.44073: calling self._execute() 25052 1726882482.44144: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.44149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.44159: variable 'omit' from source: magic vars 25052 1726882482.44406: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.44416: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.44500: variable 'profile_stat' from source: set_fact 25052 1726882482.44510: Evaluated conditional (profile_stat.stat.exists): False 25052 1726882482.44513: when evaluation is False, skipping this task 25052 1726882482.44515: _execute() done 25052 1726882482.44518: dumping result to json 25052 1726882482.44520: done dumping result, returning 25052 1726882482.44526: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-f7f6-4a6d-0000000004b6] 25052 1726882482.44529: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b6 25052 1726882482.44610: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b6 25052 1726882482.44613: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25052 1726882482.44659: no more pending results, returning what we have 25052 1726882482.44663: results queue empty 25052 1726882482.44663: checking for any_errors_fatal 25052 1726882482.44670: done checking for any_errors_fatal 25052 1726882482.44671: checking for max_fail_percentage 25052 1726882482.44672: done checking for max_fail_percentage 25052 1726882482.44673: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.44674: done checking to see if all hosts have failed 25052 1726882482.44674: getting the remaining hosts for this loop 25052 1726882482.44675: done getting the remaining hosts for this loop 25052 1726882482.44678: getting the next task for host managed_node2 25052 1726882482.44684: done getting next task for host managed_node2 25052 1726882482.44686: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 25052 1726882482.44690: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.44696: getting variables 25052 1726882482.44697: in VariableManager get_vars() 25052 1726882482.44728: Calling all_inventory to load vars for managed_node2 25052 1726882482.44730: Calling groups_inventory to load vars for managed_node2 25052 1726882482.44732: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.44740: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.44743: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.44745: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.45595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.46437: done with get_vars() 25052 1726882482.46451: done getting variables 25052 1726882482.46491: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.46563: variable 'profile' from source: include params 25052 1726882482.46566: variable 'interface' from source: play vars 25052 1726882482.46606: variable 'interface' from source: play vars TASK [Verify the ansible_managed comment in ifcfg-veth0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:34:42 -0400 (0:00:00.030) 0:00:19.421 ****** 25052 1726882482.46627: entering _queue_task() for managed_node2/set_fact 25052 1726882482.46831: worker is 1 (out of 1 available) 25052 1726882482.46844: exiting _queue_task() for managed_node2/set_fact 25052 1726882482.46855: done queuing things up, now waiting for results queue to drain 25052 1726882482.46856: waiting for pending results... 
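Each of these templated task names is rendered just before queueing, and the log records where the values come from: profile is an "include params" source and interface is a "play vars" source. That is consistent with the tasks file being pulled in per interface with profile passed as an include parameter. A sketch of such an include, assuming include_tasks and the file layout implied by the task paths, follows; only the variable names and their reported sources come from the log.

# Hypothetical sketch of how get_profile_stat.yml could be included so that
# 'profile' arrives as an include parameter while 'interface' is a play var;
# the include mechanism itself is an assumption.
- hosts: managed_node2
  vars:
    interface: veth0                 # play var, as reported in the log
  tasks:
    - name: Gather profile information for the test interface
      include_tasks: tasks/get_profile_stat.yml
      vars:
        profile: "{{ interface }}"   # shows up as the "include params" source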
25052 1726882482.47021: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 25052 1726882482.47099: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b7 25052 1726882482.47110: variable 'ansible_search_path' from source: unknown 25052 1726882482.47115: variable 'ansible_search_path' from source: unknown 25052 1726882482.47139: calling self._execute() 25052 1726882482.47210: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.47216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.47225: variable 'omit' from source: magic vars 25052 1726882482.47462: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.47471: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.47553: variable 'profile_stat' from source: set_fact 25052 1726882482.47563: Evaluated conditional (profile_stat.stat.exists): False 25052 1726882482.47566: when evaluation is False, skipping this task 25052 1726882482.47569: _execute() done 25052 1726882482.47571: dumping result to json 25052 1726882482.47573: done dumping result, returning 25052 1726882482.47580: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-veth0 [12673a56-9f93-f7f6-4a6d-0000000004b7] 25052 1726882482.47583: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b7 25052 1726882482.47665: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b7 25052 1726882482.47668: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25052 1726882482.47713: no more pending results, returning what we have 25052 1726882482.47717: results queue empty 25052 1726882482.47718: checking for any_errors_fatal 25052 1726882482.47723: done checking for any_errors_fatal 25052 1726882482.47724: checking for max_fail_percentage 25052 1726882482.47725: done checking for max_fail_percentage 25052 1726882482.47726: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.47727: done checking to see if all hosts have failed 25052 1726882482.47727: getting the remaining hosts for this loop 25052 1726882482.47729: done getting the remaining hosts for this loop 25052 1726882482.47732: getting the next task for host managed_node2 25052 1726882482.47739: done getting next task for host managed_node2 25052 1726882482.47741: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 25052 1726882482.47745: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.47748: getting variables 25052 1726882482.47749: in VariableManager get_vars() 25052 1726882482.47779: Calling all_inventory to load vars for managed_node2 25052 1726882482.47781: Calling groups_inventory to load vars for managed_node2 25052 1726882482.47783: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.47791: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.47796: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.47799: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.48524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.49369: done with get_vars() 25052 1726882482.49383: done getting variables 25052 1726882482.49422: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.49489: variable 'profile' from source: include params 25052 1726882482.49492: variable 'interface' from source: play vars 25052 1726882482.49531: variable 'interface' from source: play vars TASK [Get the fingerprint comment in ifcfg-veth0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:34:42 -0400 (0:00:00.029) 0:00:19.450 ****** 25052 1726882482.49551: entering _queue_task() for managed_node2/command 25052 1726882482.49735: worker is 1 (out of 1 available) 25052 1726882482.49748: exiting _queue_task() for managed_node2/command 25052 1726882482.49760: done queuing things up, now waiting for results queue to drain 25052 1726882482.49762: waiting for pending results... 
25052 1726882482.49915: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-veth0 25052 1726882482.49985: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b8 25052 1726882482.49995: variable 'ansible_search_path' from source: unknown 25052 1726882482.50003: variable 'ansible_search_path' from source: unknown 25052 1726882482.50031: calling self._execute() 25052 1726882482.50101: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.50105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.50119: variable 'omit' from source: magic vars 25052 1726882482.50350: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.50360: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.50442: variable 'profile_stat' from source: set_fact 25052 1726882482.50452: Evaluated conditional (profile_stat.stat.exists): False 25052 1726882482.50456: when evaluation is False, skipping this task 25052 1726882482.50458: _execute() done 25052 1726882482.50460: dumping result to json 25052 1726882482.50463: done dumping result, returning 25052 1726882482.50470: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-veth0 [12673a56-9f93-f7f6-4a6d-0000000004b8] 25052 1726882482.50473: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b8 25052 1726882482.50554: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b8 25052 1726882482.50557: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25052 1726882482.50612: no more pending results, returning what we have 25052 1726882482.50616: results queue empty 25052 1726882482.50616: checking for any_errors_fatal 25052 1726882482.50621: done checking for any_errors_fatal 25052 1726882482.50622: checking for max_fail_percentage 25052 1726882482.50623: done checking for max_fail_percentage 25052 1726882482.50624: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.50625: done checking to see if all hosts have failed 25052 1726882482.50626: getting the remaining hosts for this loop 25052 1726882482.50627: done getting the remaining hosts for this loop 25052 1726882482.50629: getting the next task for host managed_node2 25052 1726882482.50635: done getting next task for host managed_node2 25052 1726882482.50637: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 25052 1726882482.50640: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.50643: getting variables 25052 1726882482.50644: in VariableManager get_vars() 25052 1726882482.50675: Calling all_inventory to load vars for managed_node2 25052 1726882482.50677: Calling groups_inventory to load vars for managed_node2 25052 1726882482.50679: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.50686: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.50689: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.50691: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.51516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.52351: done with get_vars() 25052 1726882482.52365: done getting variables 25052 1726882482.52406: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.52474: variable 'profile' from source: include params 25052 1726882482.52477: variable 'interface' from source: play vars 25052 1726882482.52516: variable 'interface' from source: play vars TASK [Verify the fingerprint comment in ifcfg-veth0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:34:42 -0400 (0:00:00.029) 0:00:19.480 ****** 25052 1726882482.52536: entering _queue_task() for managed_node2/set_fact 25052 1726882482.52727: worker is 1 (out of 1 available) 25052 1726882482.52739: exiting _queue_task() for managed_node2/set_fact 25052 1726882482.52752: done queuing things up, now waiting for results queue to drain 25052 1726882482.52753: waiting for pending results... 
25052 1726882482.52907: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-veth0 25052 1726882482.52973: in run() - task 12673a56-9f93-f7f6-4a6d-0000000004b9 25052 1726882482.52986: variable 'ansible_search_path' from source: unknown 25052 1726882482.52989: variable 'ansible_search_path' from source: unknown 25052 1726882482.53016: calling self._execute() 25052 1726882482.53080: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.53085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.53101: variable 'omit' from source: magic vars 25052 1726882482.53330: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.53339: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.53421: variable 'profile_stat' from source: set_fact 25052 1726882482.53431: Evaluated conditional (profile_stat.stat.exists): False 25052 1726882482.53435: when evaluation is False, skipping this task 25052 1726882482.53437: _execute() done 25052 1726882482.53439: dumping result to json 25052 1726882482.53442: done dumping result, returning 25052 1726882482.53448: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-veth0 [12673a56-9f93-f7f6-4a6d-0000000004b9] 25052 1726882482.53452: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b9 25052 1726882482.53529: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000004b9 25052 1726882482.53533: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 25052 1726882482.53578: no more pending results, returning what we have 25052 1726882482.53581: results queue empty 25052 1726882482.53582: checking for any_errors_fatal 25052 1726882482.53586: done checking for any_errors_fatal 25052 1726882482.53587: checking for max_fail_percentage 25052 1726882482.53589: done checking for max_fail_percentage 25052 1726882482.53589: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.53590: done checking to see if all hosts have failed 25052 1726882482.53591: getting the remaining hosts for this loop 25052 1726882482.53592: done getting the remaining hosts for this loop 25052 1726882482.53600: getting the next task for host managed_node2 25052 1726882482.53607: done getting next task for host managed_node2 25052 1726882482.53609: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 25052 1726882482.53612: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.53615: getting variables 25052 1726882482.53617: in VariableManager get_vars() 25052 1726882482.53649: Calling all_inventory to load vars for managed_node2 25052 1726882482.53651: Calling groups_inventory to load vars for managed_node2 25052 1726882482.53653: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.53661: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.53664: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.53666: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.54371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.55225: done with get_vars() 25052 1726882482.55239: done getting variables 25052 1726882482.55276: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.55348: variable 'profile' from source: include params 25052 1726882482.55351: variable 'interface' from source: play vars 25052 1726882482.55386: variable 'interface' from source: play vars TASK [Assert that the profile is present - 'veth0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:34:42 -0400 (0:00:00.028) 0:00:19.508 ****** 25052 1726882482.55408: entering _queue_task() for managed_node2/assert 25052 1726882482.55591: worker is 1 (out of 1 available) 25052 1726882482.55605: exiting _queue_task() for managed_node2/assert 25052 1726882482.55618: done queuing things up, now waiting for results queue to drain 25052 1726882482.55620: waiting for pending results... 
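The assert task queued here, together with the two that follow it, comes from assert_profile_present.yml and checks the lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint facts that were set earlier in the run; all three conditionals evaluate to True below and each task reports "All assertions passed". A plausible sketch of that tasks file is shown here; the fail_msg text is an assumption, while the task names and asserted fact names match the log.

# Hypothetical sketch of assert_profile_present.yml; the fail_msg text is an
# assumption, the asserted facts are the ones evaluated in the log.
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} was not found"          # assumed message

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint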
25052 1726882482.55773: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'veth0' 25052 1726882482.55833: in run() - task 12673a56-9f93-f7f6-4a6d-0000000003b9 25052 1726882482.55848: variable 'ansible_search_path' from source: unknown 25052 1726882482.55851: variable 'ansible_search_path' from source: unknown 25052 1726882482.55875: calling self._execute() 25052 1726882482.55942: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.55945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.55955: variable 'omit' from source: magic vars 25052 1726882482.56200: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.56210: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.56215: variable 'omit' from source: magic vars 25052 1726882482.56239: variable 'omit' from source: magic vars 25052 1726882482.56309: variable 'profile' from source: include params 25052 1726882482.56313: variable 'interface' from source: play vars 25052 1726882482.56358: variable 'interface' from source: play vars 25052 1726882482.56372: variable 'omit' from source: magic vars 25052 1726882482.56409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.56435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.56450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.56463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.56473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.56501: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882482.56504: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.56506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.56572: Set connection var ansible_pipelining to False 25052 1726882482.56576: Set connection var ansible_connection to ssh 25052 1726882482.56578: Set connection var ansible_shell_type to sh 25052 1726882482.56583: Set connection var ansible_timeout to 10 25052 1726882482.56590: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.56598: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.56617: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.56621: variable 'ansible_connection' from source: unknown 25052 1726882482.56624: variable 'ansible_module_compression' from source: unknown 25052 1726882482.56626: variable 'ansible_shell_type' from source: unknown 25052 1726882482.56628: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.56630: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.56632: variable 'ansible_pipelining' from source: unknown 25052 1726882482.56635: variable 'ansible_timeout' from source: unknown 25052 1726882482.56637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.56735: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.56744: variable 'omit' from source: magic vars 25052 1726882482.56749: starting attempt loop 25052 1726882482.56751: running the handler 25052 1726882482.56827: variable 'lsr_net_profile_exists' from source: set_fact 25052 1726882482.56831: Evaluated conditional (lsr_net_profile_exists): True 25052 1726882482.56837: handler run complete 25052 1726882482.56848: attempt loop complete, returning result 25052 1726882482.56850: _execute() done 25052 1726882482.56853: dumping result to json 25052 1726882482.56856: done dumping result, returning 25052 1726882482.56862: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'veth0' [12673a56-9f93-f7f6-4a6d-0000000003b9] 25052 1726882482.56865: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003b9 25052 1726882482.56945: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003b9 25052 1726882482.56948: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882482.56996: no more pending results, returning what we have 25052 1726882482.56999: results queue empty 25052 1726882482.56999: checking for any_errors_fatal 25052 1726882482.57004: done checking for any_errors_fatal 25052 1726882482.57005: checking for max_fail_percentage 25052 1726882482.57006: done checking for max_fail_percentage 25052 1726882482.57007: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.57008: done checking to see if all hosts have failed 25052 1726882482.57009: getting the remaining hosts for this loop 25052 1726882482.57010: done getting the remaining hosts for this loop 25052 1726882482.57012: getting the next task for host managed_node2 25052 1726882482.57018: done getting next task for host managed_node2 25052 1726882482.57020: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 25052 1726882482.57023: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.57026: getting variables 25052 1726882482.57028: in VariableManager get_vars() 25052 1726882482.57061: Calling all_inventory to load vars for managed_node2 25052 1726882482.57063: Calling groups_inventory to load vars for managed_node2 25052 1726882482.57066: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.57074: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.57076: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.57078: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.57895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.58736: done with get_vars() 25052 1726882482.58750: done getting variables 25052 1726882482.58786: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.58858: variable 'profile' from source: include params 25052 1726882482.58861: variable 'interface' from source: play vars 25052 1726882482.58899: variable 'interface' from source: play vars TASK [Assert that the ansible managed comment is present in 'veth0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:34:42 -0400 (0:00:00.035) 0:00:19.544 ****** 25052 1726882482.58923: entering _queue_task() for managed_node2/assert 25052 1726882482.59108: worker is 1 (out of 1 available) 25052 1726882482.59120: exiting _queue_task() for managed_node2/assert 25052 1726882482.59132: done queuing things up, now waiting for results queue to drain 25052 1726882482.59133: waiting for pending results... 
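Before each assert runs, the worker resolves the connection settings for managed_node2; in this run they are all defaults: ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, ansible_pipelining=False, ansible_timeout=10, and ansible_module_compression=ZIP_DEFLATED. These are ordinary connection variables and can be overridden per host or group; a host_vars sketch is below, where enabling pipelining and raising the timeout are illustrative choices rather than anything taken from this run.

# Hypothetical host_vars/managed_node2.yml; the variable names are standard
# Ansible connection variables, the overridden values are illustrative only.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: true        # default in this run is False
ansible_timeout: 30             # default in this run is 10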
25052 1726882482.59289: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'veth0' 25052 1726882482.59354: in run() - task 12673a56-9f93-f7f6-4a6d-0000000003ba 25052 1726882482.59364: variable 'ansible_search_path' from source: unknown 25052 1726882482.59367: variable 'ansible_search_path' from source: unknown 25052 1726882482.59396: calling self._execute() 25052 1726882482.59465: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.59469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.59476: variable 'omit' from source: magic vars 25052 1726882482.59720: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.59730: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.59736: variable 'omit' from source: magic vars 25052 1726882482.59762: variable 'omit' from source: magic vars 25052 1726882482.59834: variable 'profile' from source: include params 25052 1726882482.59838: variable 'interface' from source: play vars 25052 1726882482.59882: variable 'interface' from source: play vars 25052 1726882482.59903: variable 'omit' from source: magic vars 25052 1726882482.59935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.59960: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.59976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.59988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.60002: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.60031: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882482.60034: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.60036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.60104: Set connection var ansible_pipelining to False 25052 1726882482.60107: Set connection var ansible_connection to ssh 25052 1726882482.60110: Set connection var ansible_shell_type to sh 25052 1726882482.60115: Set connection var ansible_timeout to 10 25052 1726882482.60124: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.60129: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.60145: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.60148: variable 'ansible_connection' from source: unknown 25052 1726882482.60151: variable 'ansible_module_compression' from source: unknown 25052 1726882482.60153: variable 'ansible_shell_type' from source: unknown 25052 1726882482.60156: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.60158: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.60162: variable 'ansible_pipelining' from source: unknown 25052 1726882482.60164: variable 'ansible_timeout' from source: unknown 25052 1726882482.60168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.60268: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.60277: variable 'omit' from source: magic vars 25052 1726882482.60281: starting attempt loop 25052 1726882482.60285: running the handler 25052 1726882482.60362: variable 'lsr_net_profile_ansible_managed' from source: set_fact 25052 1726882482.60365: Evaluated conditional (lsr_net_profile_ansible_managed): True 25052 1726882482.60371: handler run complete 25052 1726882482.60382: attempt loop complete, returning result 25052 1726882482.60384: _execute() done 25052 1726882482.60387: dumping result to json 25052 1726882482.60389: done dumping result, returning 25052 1726882482.60400: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'veth0' [12673a56-9f93-f7f6-4a6d-0000000003ba] 25052 1726882482.60404: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003ba 25052 1726882482.60479: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003ba 25052 1726882482.60482: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882482.60528: no more pending results, returning what we have 25052 1726882482.60531: results queue empty 25052 1726882482.60532: checking for any_errors_fatal 25052 1726882482.60539: done checking for any_errors_fatal 25052 1726882482.60539: checking for max_fail_percentage 25052 1726882482.60541: done checking for max_fail_percentage 25052 1726882482.60542: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.60543: done checking to see if all hosts have failed 25052 1726882482.60544: getting the remaining hosts for this loop 25052 1726882482.60545: done getting the remaining hosts for this loop 25052 1726882482.60547: getting the next task for host managed_node2 25052 1726882482.60554: done getting next task for host managed_node2 25052 1726882482.60556: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 25052 1726882482.60559: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.60562: getting variables 25052 1726882482.60564: in VariableManager get_vars() 25052 1726882482.60608: Calling all_inventory to load vars for managed_node2 25052 1726882482.60611: Calling groups_inventory to load vars for managed_node2 25052 1726882482.60614: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.60623: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.60625: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.60627: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.61395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.62334: done with get_vars() 25052 1726882482.62348: done getting variables 25052 1726882482.62386: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882482.62462: variable 'profile' from source: include params 25052 1726882482.62465: variable 'interface' from source: play vars 25052 1726882482.62506: variable 'interface' from source: play vars TASK [Assert that the fingerprint comment is present in veth0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:34:42 -0400 (0:00:00.036) 0:00:19.580 ****** 25052 1726882482.62531: entering _queue_task() for managed_node2/assert 25052 1726882482.62732: worker is 1 (out of 1 available) 25052 1726882482.62745: exiting _queue_task() for managed_node2/assert 25052 1726882482.62756: done queuing things up, now waiting for results queue to drain 25052 1726882482.62757: waiting for pending results... 
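Every task in this stretch of the log first evaluates ansible_distribution_major_version != '6' against gathered facts, so the whole test is effectively gated off major-version-6 hosts. A guard like that does not have to be repeated on each task; one illustrative way to apply it once is a block-level when, sketched below (the block structure and relative include path are assumptions, only the conditional itself appears in the log).

# Illustrative only: applying the fact-based guard seen before every task in
# this log once, at block level, instead of per task.
- name: Run the profile assertions only when not on a major version 6 distribution
  when: ansible_distribution_major_version != '6'
  block:
    - name: Include the profile assertions
      include_tasks: tasks/assert_profile_present.yml   # relative path assumed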
25052 1726882482.62924: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in veth0 25052 1726882482.62988: in run() - task 12673a56-9f93-f7f6-4a6d-0000000003bb 25052 1726882482.63000: variable 'ansible_search_path' from source: unknown 25052 1726882482.63003: variable 'ansible_search_path' from source: unknown 25052 1726882482.63029: calling self._execute() 25052 1726882482.63100: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.63107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.63115: variable 'omit' from source: magic vars 25052 1726882482.63363: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.63373: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.63379: variable 'omit' from source: magic vars 25052 1726882482.63407: variable 'omit' from source: magic vars 25052 1726882482.63475: variable 'profile' from source: include params 25052 1726882482.63479: variable 'interface' from source: play vars 25052 1726882482.63529: variable 'interface' from source: play vars 25052 1726882482.63542: variable 'omit' from source: magic vars 25052 1726882482.63571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.63599: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.63613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.63627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.63638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.63667: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882482.63670: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.63672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.63745: Set connection var ansible_pipelining to False 25052 1726882482.63748: Set connection var ansible_connection to ssh 25052 1726882482.63752: Set connection var ansible_shell_type to sh 25052 1726882482.63754: Set connection var ansible_timeout to 10 25052 1726882482.63759: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.63762: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.63778: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.63781: variable 'ansible_connection' from source: unknown 25052 1726882482.63783: variable 'ansible_module_compression' from source: unknown 25052 1726882482.63785: variable 'ansible_shell_type' from source: unknown 25052 1726882482.63788: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.63790: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.63796: variable 'ansible_pipelining' from source: unknown 25052 1726882482.63799: variable 'ansible_timeout' from source: unknown 25052 1726882482.63801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.63898: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.63905: variable 'omit' from source: magic vars 25052 1726882482.63910: starting attempt loop 25052 1726882482.63913: running the handler 25052 1726882482.63986: variable 'lsr_net_profile_fingerprint' from source: set_fact 25052 1726882482.63989: Evaluated conditional (lsr_net_profile_fingerprint): True 25052 1726882482.63998: handler run complete 25052 1726882482.64008: attempt loop complete, returning result 25052 1726882482.64011: _execute() done 25052 1726882482.64013: dumping result to json 25052 1726882482.64016: done dumping result, returning 25052 1726882482.64022: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in veth0 [12673a56-9f93-f7f6-4a6d-0000000003bb] 25052 1726882482.64026: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003bb 25052 1726882482.64102: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000003bb 25052 1726882482.64105: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882482.64148: no more pending results, returning what we have 25052 1726882482.64152: results queue empty 25052 1726882482.64153: checking for any_errors_fatal 25052 1726882482.64157: done checking for any_errors_fatal 25052 1726882482.64158: checking for max_fail_percentage 25052 1726882482.64159: done checking for max_fail_percentage 25052 1726882482.64160: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.64161: done checking to see if all hosts have failed 25052 1726882482.64162: getting the remaining hosts for this loop 25052 1726882482.64163: done getting the remaining hosts for this loop 25052 1726882482.64165: getting the next task for host managed_node2 25052 1726882482.64172: done getting next task for host managed_node2 25052 1726882482.64174: ^ task is: TASK: Get ip address information 25052 1726882482.64176: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.64180: getting variables 25052 1726882482.64181: in VariableManager get_vars() 25052 1726882482.64218: Calling all_inventory to load vars for managed_node2 25052 1726882482.64221: Calling groups_inventory to load vars for managed_node2 25052 1726882482.64223: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.64232: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.64234: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.64236: Calling groups_plugins_play to load vars for managed_node2 25052 1726882482.64963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882482.65811: done with get_vars() 25052 1726882482.65827: done getting variables 25052 1726882482.65864: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ip address information] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:53 Friday 20 September 2024 21:34:42 -0400 (0:00:00.033) 0:00:19.613 ****** 25052 1726882482.65882: entering _queue_task() for managed_node2/command 25052 1726882482.66074: worker is 1 (out of 1 available) 25052 1726882482.66087: exiting _queue_task() for managed_node2/command 25052 1726882482.66099: done queuing things up, now waiting for results queue to drain 25052 1726882482.66100: waiting for pending results... 25052 1726882482.66272: running TaskExecutor() for managed_node2/TASK: Get ip address information 25052 1726882482.66336: in run() - task 12673a56-9f93-f7f6-4a6d-00000000005e 25052 1726882482.66347: variable 'ansible_search_path' from source: unknown 25052 1726882482.66373: calling self._execute() 25052 1726882482.66449: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.66452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.66461: variable 'omit' from source: magic vars 25052 1726882482.66716: variable 'ansible_distribution_major_version' from source: facts 25052 1726882482.66725: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882482.66732: variable 'omit' from source: magic vars 25052 1726882482.66747: variable 'omit' from source: magic vars 25052 1726882482.66817: variable 'interface' from source: play vars 25052 1726882482.66833: variable 'omit' from source: magic vars 25052 1726882482.66867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882482.66892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882482.66911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882482.66924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.66934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882482.66956: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 
1726882482.66960: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.66962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.67036: Set connection var ansible_pipelining to False 25052 1726882482.67039: Set connection var ansible_connection to ssh 25052 1726882482.67041: Set connection var ansible_shell_type to sh 25052 1726882482.67047: Set connection var ansible_timeout to 10 25052 1726882482.67053: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882482.67058: Set connection var ansible_shell_executable to /bin/sh 25052 1726882482.67073: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.67077: variable 'ansible_connection' from source: unknown 25052 1726882482.67080: variable 'ansible_module_compression' from source: unknown 25052 1726882482.67083: variable 'ansible_shell_type' from source: unknown 25052 1726882482.67086: variable 'ansible_shell_executable' from source: unknown 25052 1726882482.67088: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882482.67091: variable 'ansible_pipelining' from source: unknown 25052 1726882482.67100: variable 'ansible_timeout' from source: unknown 25052 1726882482.67106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882482.67191: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882482.67208: variable 'omit' from source: magic vars 25052 1726882482.67217: starting attempt loop 25052 1726882482.67220: running the handler 25052 1726882482.67223: _low_level_execute_command(): starting 25052 1726882482.67231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882482.67735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.67738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.67741: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.67743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.67800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.67804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.67806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.67875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 25052 1726882482.69518: stdout chunk (state=3): >>>/root <<< 25052 1726882482.69617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.69644: stderr chunk (state=3): >>><<< 25052 1726882482.69647: stdout chunk (state=3): >>><<< 25052 1726882482.69666: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.69676: _low_level_execute_command(): starting 25052 1726882482.69682: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920 `" && echo ansible-tmp-1726882482.6966512-25966-227100709818920="` echo /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920 `" ) && sleep 0' 25052 1726882482.70122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.70125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882482.70127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.70136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.70138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.70184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.70187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.70195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.70258: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 25052 1726882482.72123: stdout chunk (state=3): >>>ansible-tmp-1726882482.6966512-25966-227100709818920=/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920 <<< 25052 1726882482.72230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.72254: stderr chunk (state=3): >>><<< 25052 1726882482.72258: stdout chunk (state=3): >>><<< 25052 1726882482.72271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882482.6966512-25966-227100709818920=/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.72301: variable 'ansible_module_compression' from source: unknown 25052 1726882482.72337: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882482.72362: variable 'ansible_facts' from source: unknown 25052 1726882482.72417: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py 25052 1726882482.72509: Sending initial data 25052 1726882482.72513: Sent initial data (156 bytes) 25052 1726882482.72939: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.72942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882482.72944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.72947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.72951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
25052 1726882482.73000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.73009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.73072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.74595: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 25052 1726882482.74599: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882482.74651: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882482.74717: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp3362sfhx /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py <<< 25052 1726882482.74720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py" <<< 25052 1726882482.74776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp3362sfhx" to remote "/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py" <<< 25052 1726882482.74781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py" <<< 25052 1726882482.75383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.75421: stderr chunk (state=3): >>><<< 25052 1726882482.75424: stdout chunk (state=3): >>><<< 25052 1726882482.75463: done transferring module to remote 25052 1726882482.75471: _low_level_execute_command(): starting 25052 1726882482.75475: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/ /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py && sleep 0' 25052 1726882482.75879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882482.75919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882482.75922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.75924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.75930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882482.75932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.75970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882482.75973: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.76042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.77762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.77784: stderr chunk (state=3): >>><<< 25052 1726882482.77787: stdout chunk (state=3): >>><<< 25052 1726882482.77803: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.77808: _low_level_execute_command(): starting 25052 1726882482.77811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/AnsiballZ_command.py && sleep 0' 25052 1726882482.78213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.78216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882482.78218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.78220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882482.78269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.78272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.78340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.93582: stdout chunk (state=3): >>> {"changed": true, "stdout": "22: veth0@if21: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 42:10:dd:bb:83:3d brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::4010:ddff:febb:833d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:34:42.931279", "end": "2024-09-20 21:34:42.934758", "delta": "0:00:00.003479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882482.95100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882482.95104: stdout chunk (state=3): >>><<< 25052 1726882482.95107: stderr chunk (state=3): >>><<< 25052 1726882482.95109: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "22: veth0@if21: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 42:10:dd:bb:83:3d brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::4010:ddff:febb:833d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "addr", "show", "veth0"], "start": "2024-09-20 21:34:42.931279", "end": "2024-09-20 21:34:42.934758", "delta": "0:00:00.003479", "msg": "", "invocation": {"module_args": {"_raw_params": "ip addr show veth0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882482.95112: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip addr show veth0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882482.95121: _low_level_execute_command(): starting 25052 1726882482.95123: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882482.6966512-25966-227100709818920/ > /dev/null 2>&1 && sleep 0' 25052 1726882482.95786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882482.95790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882482.95816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882482.95841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882482.95931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882482.97764: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882482.97772: stdout chunk (state=3): >>><<< 25052 1726882482.97781: stderr chunk (state=3): >>><<< 25052 1726882482.97805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882482.97821: handler run complete 25052 1726882482.97849: Evaluated conditional (False): False 25052 1726882482.97863: attempt loop complete, returning result 25052 1726882482.97871: _execute() done 25052 1726882482.97999: dumping result to json 25052 1726882482.98002: done dumping result, returning 25052 1726882482.98005: done running TaskExecutor() for managed_node2/TASK: Get ip address information [12673a56-9f93-f7f6-4a6d-00000000005e] 25052 1726882482.98007: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005e 25052 1726882482.98081: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005e 25052 1726882482.98084: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "addr", "show", "veth0" ], "delta": "0:00:00.003479", "end": "2024-09-20 21:34:42.934758", "rc": 0, "start": "2024-09-20 21:34:42.931279" } STDOUT: 22: veth0@if21: mtu 1500 qdisc noqueue state UP group default qlen 1000 link/ether 42:10:dd:bb:83:3d brd ff:ff:ff:ff:ff:ff link-netns ns1 inet6 2001:db8::2/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::3/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 2001:db8::4/32 scope global noprefixroute valid_lft forever preferred_lft forever inet6 fe80::4010:ddff:febb:833d/64 scope link noprefixroute valid_lft forever preferred_lft forever 25052 1726882482.98175: no more pending results, returning what we have 25052 1726882482.98180: results queue empty 25052 1726882482.98181: checking for any_errors_fatal 25052 1726882482.98188: done checking for any_errors_fatal 25052 1726882482.98188: checking for max_fail_percentage 25052 1726882482.98190: done checking for max_fail_percentage 25052 1726882482.98201: checking to see if all hosts have failed and the running result is not ok 25052 1726882482.98202: done checking to see if all hosts have failed 25052 1726882482.98203: getting the remaining hosts for this loop 25052 1726882482.98204: done getting the remaining hosts for this loop 25052 1726882482.98208: getting the next task for host managed_node2 25052 1726882482.98215: done getting next task for host managed_node2 25052 1726882482.98218: ^ task is: TASK: Show ip_addr 25052 1726882482.98220: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882482.98225: getting variables 25052 1726882482.98227: in VariableManager get_vars() 25052 1726882482.98271: Calling all_inventory to load vars for managed_node2 25052 1726882482.98274: Calling groups_inventory to load vars for managed_node2 25052 1726882482.98277: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882482.98289: Calling all_plugins_play to load vars for managed_node2 25052 1726882482.98404: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882482.98410: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.05000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.06573: done with get_vars() 25052 1726882483.06604: done getting variables 25052 1726882483.06653: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ip_addr] ************************************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:57 Friday 20 September 2024 21:34:43 -0400 (0:00:00.407) 0:00:20.021 ****** 25052 1726882483.06683: entering _queue_task() for managed_node2/debug 25052 1726882483.07054: worker is 1 (out of 1 available) 25052 1726882483.07066: exiting _queue_task() for managed_node2/debug 25052 1726882483.07078: done queuing things up, now waiting for results queue to drain 25052 1726882483.07080: waiting for pending results... 25052 1726882483.07517: running TaskExecutor() for managed_node2/TASK: Show ip_addr 25052 1726882483.07522: in run() - task 12673a56-9f93-f7f6-4a6d-00000000005f 25052 1726882483.07525: variable 'ansible_search_path' from source: unknown 25052 1726882483.07539: calling self._execute() 25052 1726882483.07654: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.07669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.07718: variable 'omit' from source: magic vars 25052 1726882483.08087: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.08110: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.08123: variable 'omit' from source: magic vars 25052 1726882483.08152: variable 'omit' from source: magic vars 25052 1726882483.08261: variable 'omit' from source: magic vars 25052 1726882483.08264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.08301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.08326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.08348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.08369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.08414: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.08424: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 
1726882483.08432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.08549: Set connection var ansible_pipelining to False 25052 1726882483.08558: Set connection var ansible_connection to ssh 25052 1726882483.08587: Set connection var ansible_shell_type to sh 25052 1726882483.08590: Set connection var ansible_timeout to 10 25052 1726882483.08600: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.08610: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.08701: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.08705: variable 'ansible_connection' from source: unknown 25052 1726882483.08708: variable 'ansible_module_compression' from source: unknown 25052 1726882483.08710: variable 'ansible_shell_type' from source: unknown 25052 1726882483.08713: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.08715: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.08717: variable 'ansible_pipelining' from source: unknown 25052 1726882483.08719: variable 'ansible_timeout' from source: unknown 25052 1726882483.08721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.08899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.08902: variable 'omit' from source: magic vars 25052 1726882483.08905: starting attempt loop 25052 1726882483.08908: running the handler 25052 1726882483.09005: variable 'ip_addr' from source: set_fact 25052 1726882483.09031: handler run complete 25052 1726882483.09056: attempt loop complete, returning result 25052 1726882483.09064: _execute() done 25052 1726882483.09070: dumping result to json 25052 1726882483.09077: done dumping result, returning 25052 1726882483.09089: done running TaskExecutor() for managed_node2/TASK: Show ip_addr [12673a56-9f93-f7f6-4a6d-00000000005f] 25052 1726882483.09103: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005f ok: [managed_node2] => { "ip_addr.stdout": "22: veth0@if21: mtu 1500 qdisc noqueue state UP group default qlen 1000\n link/ether 42:10:dd:bb:83:3d brd ff:ff:ff:ff:ff:ff link-netns ns1\n inet6 2001:db8::2/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::3/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 2001:db8::4/32 scope global noprefixroute \n valid_lft forever preferred_lft forever\n inet6 fe80::4010:ddff:febb:833d/64 scope link noprefixroute \n valid_lft forever preferred_lft forever" } 25052 1726882483.09333: no more pending results, returning what we have 25052 1726882483.09337: results queue empty 25052 1726882483.09338: checking for any_errors_fatal 25052 1726882483.09346: done checking for any_errors_fatal 25052 1726882483.09347: checking for max_fail_percentage 25052 1726882483.09349: done checking for max_fail_percentage 25052 1726882483.09350: checking to see if all hosts have failed and the running result is not ok 25052 1726882483.09350: done checking to see if all hosts have failed 25052 1726882483.09351: getting the remaining hosts for this loop 25052 1726882483.09353: done getting the remaining hosts for this loop 25052 1726882483.09356: getting the next task for 
host managed_node2 25052 1726882483.09362: done getting next task for host managed_node2 25052 1726882483.09366: ^ task is: TASK: Assert ipv6 addresses are correctly set 25052 1726882483.09368: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882483.09372: getting variables 25052 1726882483.09374: in VariableManager get_vars() 25052 1726882483.09420: Calling all_inventory to load vars for managed_node2 25052 1726882483.09424: Calling groups_inventory to load vars for managed_node2 25052 1726882483.09426: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882483.09437: Calling all_plugins_play to load vars for managed_node2 25052 1726882483.09440: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882483.09443: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.10007: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000005f 25052 1726882483.10010: WORKER PROCESS EXITING 25052 1726882483.11049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.12677: done with get_vars() 25052 1726882483.12707: done getting variables 25052 1726882483.12764: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert ipv6 addresses are correctly set] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:60 Friday 20 September 2024 21:34:43 -0400 (0:00:00.061) 0:00:20.082 ****** 25052 1726882483.12795: entering _queue_task() for managed_node2/assert 25052 1726882483.13224: worker is 1 (out of 1 available) 25052 1726882483.13236: exiting _queue_task() for managed_node2/assert 25052 1726882483.13246: done queuing things up, now waiting for results queue to drain 25052 1726882483.13247: waiting for pending results... 
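The "Show ip_addr" result above and the assert that runs next likely correspond to playbook tasks of roughly the shape sketched below; this is inferred from the task names, the file positions tests_ipv6.yml:57 and tests_ipv6.yml:60, and the three conditionals evaluated in the trace that follows, not quoted from the playbook itself.

  # Plausible sketch of tests_ipv6.yml:57 and :60 -- reconstructed from this trace
  - name: Show ip_addr
    debug:
      var: ip_addr.stdout

  - name: Assert ipv6 addresses are correctly set
    assert:
      that:
        - "'inet6 2001:db8::2/32' in ip_addr.stdout"
        - "'inet6 2001:db8::3/32' in ip_addr.stdout"
        - "'inet6 2001:db8::4/32' in ip_addr.stdout"
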
25052 1726882483.13426: running TaskExecutor() for managed_node2/TASK: Assert ipv6 addresses are correctly set 25052 1726882483.13530: in run() - task 12673a56-9f93-f7f6-4a6d-000000000060 25052 1726882483.13551: variable 'ansible_search_path' from source: unknown 25052 1726882483.13599: calling self._execute() 25052 1726882483.13715: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.13728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.13749: variable 'omit' from source: magic vars 25052 1726882483.14147: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.14164: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.14180: variable 'omit' from source: magic vars 25052 1726882483.14212: variable 'omit' from source: magic vars 25052 1726882483.14258: variable 'omit' from source: magic vars 25052 1726882483.14342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.14355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.14380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.14411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.14429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.14467: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.14508: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.14511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.14581: Set connection var ansible_pipelining to False 25052 1726882483.14587: Set connection var ansible_connection to ssh 25052 1726882483.14596: Set connection var ansible_shell_type to sh 25052 1726882483.14609: Set connection var ansible_timeout to 10 25052 1726882483.14667: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.14670: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.14672: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.14674: variable 'ansible_connection' from source: unknown 25052 1726882483.14676: variable 'ansible_module_compression' from source: unknown 25052 1726882483.14679: variable 'ansible_shell_type' from source: unknown 25052 1726882483.14681: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.14682: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.14696: variable 'ansible_pipelining' from source: unknown 25052 1726882483.14726: variable 'ansible_timeout' from source: unknown 25052 1726882483.14729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.14858: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.14871: variable 'omit' from source: magic vars 25052 1726882483.14907: starting attempt loop 25052 1726882483.14910: 
running the handler 25052 1726882483.15041: variable 'ip_addr' from source: set_fact 25052 1726882483.15065: Evaluated conditional ('inet6 2001:db8::2/32' in ip_addr.stdout): True 25052 1726882483.15199: variable 'ip_addr' from source: set_fact 25052 1726882483.15233: Evaluated conditional ('inet6 2001:db8::3/32' in ip_addr.stdout): True 25052 1726882483.15348: variable 'ip_addr' from source: set_fact 25052 1726882483.15362: Evaluated conditional ('inet6 2001:db8::4/32' in ip_addr.stdout): True 25052 1726882483.15447: handler run complete 25052 1726882483.15451: attempt loop complete, returning result 25052 1726882483.15454: _execute() done 25052 1726882483.15457: dumping result to json 25052 1726882483.15459: done dumping result, returning 25052 1726882483.15461: done running TaskExecutor() for managed_node2/TASK: Assert ipv6 addresses are correctly set [12673a56-9f93-f7f6-4a6d-000000000060] 25052 1726882483.15463: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000060 25052 1726882483.15538: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000060 25052 1726882483.15541: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882483.15801: no more pending results, returning what we have 25052 1726882483.15804: results queue empty 25052 1726882483.15805: checking for any_errors_fatal 25052 1726882483.15810: done checking for any_errors_fatal 25052 1726882483.15811: checking for max_fail_percentage 25052 1726882483.15812: done checking for max_fail_percentage 25052 1726882483.15813: checking to see if all hosts have failed and the running result is not ok 25052 1726882483.15814: done checking to see if all hosts have failed 25052 1726882483.15815: getting the remaining hosts for this loop 25052 1726882483.15816: done getting the remaining hosts for this loop 25052 1726882483.15819: getting the next task for host managed_node2 25052 1726882483.15825: done getting next task for host managed_node2 25052 1726882483.15827: ^ task is: TASK: Get ipv6 routes 25052 1726882483.15829: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882483.15832: getting variables 25052 1726882483.15834: in VariableManager get_vars() 25052 1726882483.15871: Calling all_inventory to load vars for managed_node2 25052 1726882483.15874: Calling groups_inventory to load vars for managed_node2 25052 1726882483.15877: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882483.15887: Calling all_plugins_play to load vars for managed_node2 25052 1726882483.15890: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882483.15897: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.17575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.19176: done with get_vars() 25052 1726882483.19200: done getting variables 25052 1726882483.19256: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get ipv6 routes] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:69 Friday 20 September 2024 21:34:43 -0400 (0:00:00.064) 0:00:20.147 ****** 25052 1726882483.19288: entering _queue_task() for managed_node2/command 25052 1726882483.19719: worker is 1 (out of 1 available) 25052 1726882483.19729: exiting _queue_task() for managed_node2/command 25052 1726882483.19739: done queuing things up, now waiting for results queue to drain 25052 1726882483.19740: waiting for pending results... 25052 1726882483.20013: running TaskExecutor() for managed_node2/TASK: Get ipv6 routes 25052 1726882483.20134: in run() - task 12673a56-9f93-f7f6-4a6d-000000000061 25052 1726882483.20139: variable 'ansible_search_path' from source: unknown 25052 1726882483.20142: calling self._execute() 25052 1726882483.20195: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.20209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.20224: variable 'omit' from source: magic vars 25052 1726882483.20630: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.20648: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.20660: variable 'omit' from source: magic vars 25052 1726882483.20696: variable 'omit' from source: magic vars 25052 1726882483.20738: variable 'omit' from source: magic vars 25052 1726882483.20799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.20914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.20917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.20919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.20921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.20947: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.20957: variable 'ansible_host' from source: host vars for 
'managed_node2' 25052 1726882483.20965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.21088: Set connection var ansible_pipelining to False 25052 1726882483.21101: Set connection var ansible_connection to ssh 25052 1726882483.21113: Set connection var ansible_shell_type to sh 25052 1726882483.21132: Set connection var ansible_timeout to 10 25052 1726882483.21145: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.21154: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.21181: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.21189: variable 'ansible_connection' from source: unknown 25052 1726882483.21203: variable 'ansible_module_compression' from source: unknown 25052 1726882483.21216: variable 'ansible_shell_type' from source: unknown 25052 1726882483.21238: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.21241: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.21243: variable 'ansible_pipelining' from source: unknown 25052 1726882483.21245: variable 'ansible_timeout' from source: unknown 25052 1726882483.21326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.21403: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.21420: variable 'omit' from source: magic vars 25052 1726882483.21435: starting attempt loop 25052 1726882483.21442: running the handler 25052 1726882483.21467: _low_level_execute_command(): starting 25052 1726882483.21479: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882483.22200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.22227: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.22315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.22346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.22362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.22385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.22498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.24101: stdout chunk (state=3): >>>/root <<< 25052 1726882483.24219: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 25052 1726882483.24271: stderr chunk (state=3): >>><<< 25052 1726882483.24274: stdout chunk (state=3): >>><<< 25052 1726882483.24382: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.24386: _low_level_execute_command(): starting 25052 1726882483.24390: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751 `" && echo ansible-tmp-1726882483.2429967-25981-167330800017751="` echo /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751 `" ) && sleep 0' 25052 1726882483.24931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.24947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.24962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.24987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.25060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.25150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.25222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.25503: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.27185: stdout chunk (state=3): 
>>>ansible-tmp-1726882483.2429967-25981-167330800017751=/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751 <<< 25052 1726882483.27316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.27374: stderr chunk (state=3): >>><<< 25052 1726882483.27377: stdout chunk (state=3): >>><<< 25052 1726882483.27599: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882483.2429967-25981-167330800017751=/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.27602: variable 'ansible_module_compression' from source: unknown 25052 1726882483.27605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882483.27607: variable 'ansible_facts' from source: unknown 25052 1726882483.27633: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py 25052 1726882483.27855: Sending initial data 25052 1726882483.27858: Sent initial data (156 bytes) 25052 1726882483.28422: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.28499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.28552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.28568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.28599: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.28730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.30313: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882483.30332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882483.30404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp5bvfisvv /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py <<< 25052 1726882483.30415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py" <<< 25052 1726882483.30490: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp5bvfisvv" to remote "/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py" <<< 25052 1726882483.32165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.32291: stderr chunk (state=3): >>><<< 25052 1726882483.32298: stdout chunk (state=3): >>><<< 25052 1726882483.32352: done transferring module to remote 25052 1726882483.32370: _low_level_execute_command(): starting 25052 1726882483.32380: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/ /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py && sleep 0' 25052 1726882483.33269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.33272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.33275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.33309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.33327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.33340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.33499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.35263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.35272: stdout chunk (state=3): >>><<< 25052 1726882483.35274: stderr chunk (state=3): >>><<< 25052 1726882483.35290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.35296: _low_level_execute_command(): starting 25052 1726882483.35303: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/AnsiballZ_command.py && sleep 0' 25052 1726882483.36088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.36118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.36129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.36144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.36161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882483.36164: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882483.36267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.36270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882483.36272: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882483.36274: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882483.36276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.36278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.36376: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.36599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.36763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.51923: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:34:43.515034", "end": "2024-09-20 21:34:43.518381", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882483.53305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882483.53329: stderr chunk (state=3): >>><<< 25052 1726882483.53332: stdout chunk (state=3): >>><<< 25052 1726882483.53350: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-20 21:34:43.515034", "end": "2024-09-20 21:34:43.518381", "delta": "0:00:00.003347", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882483.53378: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882483.53384: _low_level_execute_command(): starting 25052 1726882483.53389: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882483.2429967-25981-167330800017751/ > /dev/null 2>&1 && sleep 0' 25052 1726882483.53850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.53853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.53861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882483.53863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.53865: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.53907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.53910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.53977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.55762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.55789: stderr chunk (state=3): >>><<< 25052 1726882483.55797: stdout chunk (state=3): >>><<< 25052 1726882483.55808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.55814: handler run complete 25052 1726882483.55832: Evaluated conditional (False): False 25052 1726882483.55840: attempt loop complete, returning result 25052 1726882483.55843: _execute() done 25052 1726882483.55845: dumping result to json 25052 1726882483.55850: done dumping result, returning 25052 1726882483.55858: done running TaskExecutor() for managed_node2/TASK: Get ipv6 routes [12673a56-9f93-f7f6-4a6d-000000000061] 25052 1726882483.55862: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000061 25052 1726882483.55960: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000061 25052 1726882483.55963: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.003347", "end": "2024-09-20 21:34:43.518381", "rc": 0, "start": "2024-09-20 21:34:43.515034" } STDOUT: 2001:db8::/32 dev veth0 proto kernel metric 101 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev veth0 proto kernel metric 1024 pref medium default via 2001:db8::1 dev veth0 proto static metric 101 pref medium 25052 1726882483.56059: no more pending results, returning what we have 25052 1726882483.56063: results queue empty 25052 1726882483.56064: checking for any_errors_fatal 25052 1726882483.56070: done checking for any_errors_fatal 25052 1726882483.56070: checking for max_fail_percentage 25052 1726882483.56074: done checking for max_fail_percentage 25052 1726882483.56075: checking to see if all hosts have failed and the running result is not ok 25052 1726882483.56076: done checking to see if all hosts have failed 25052 1726882483.56076: getting the remaining hosts for this loop 25052 1726882483.56078: done getting the remaining hosts for this loop 25052 1726882483.56081: getting the next task for host managed_node2 25052 1726882483.56087: done getting next task for host managed_node2 25052 1726882483.56090: ^ task is: TASK: Show ipv6_route 25052 1726882483.56095: ^ state is: HOST STATE: block=3, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882483.56099: getting variables 25052 1726882483.56100: in VariableManager get_vars() 25052 1726882483.56137: Calling all_inventory to load vars for managed_node2 25052 1726882483.56139: Calling groups_inventory to load vars for managed_node2 25052 1726882483.56141: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882483.56150: Calling all_plugins_play to load vars for managed_node2 25052 1726882483.56153: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882483.56155: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.56961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.57839: done with get_vars() 25052 1726882483.57856: done getting variables 25052 1726882483.57903: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show ipv6_route] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:73 Friday 20 September 2024 21:34:43 -0400 (0:00:00.386) 0:00:20.534 ****** 25052 1726882483.57924: entering _queue_task() for managed_node2/debug 25052 1726882483.58167: worker is 1 (out of 1 available) 25052 1726882483.58180: exiting _queue_task() for managed_node2/debug 25052 1726882483.58196: done queuing things up, now waiting for results queue to drain 25052 1726882483.58198: waiting for pending results... 25052 1726882483.58366: running TaskExecutor() for managed_node2/TASK: Show ipv6_route 25052 1726882483.58428: in run() - task 12673a56-9f93-f7f6-4a6d-000000000062 25052 1726882483.58440: variable 'ansible_search_path' from source: unknown 25052 1726882483.58467: calling self._execute() 25052 1726882483.58546: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.58550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.58560: variable 'omit' from source: magic vars 25052 1726882483.58831: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.58840: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.58846: variable 'omit' from source: magic vars 25052 1726882483.58868: variable 'omit' from source: magic vars 25052 1726882483.58897: variable 'omit' from source: magic vars 25052 1726882483.58927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.58953: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.58971: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.58985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.58997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.59021: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.59024: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 
1726882483.59026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.59102: Set connection var ansible_pipelining to False 25052 1726882483.59105: Set connection var ansible_connection to ssh 25052 1726882483.59108: Set connection var ansible_shell_type to sh 25052 1726882483.59113: Set connection var ansible_timeout to 10 25052 1726882483.59119: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.59124: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.59140: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.59143: variable 'ansible_connection' from source: unknown 25052 1726882483.59145: variable 'ansible_module_compression' from source: unknown 25052 1726882483.59148: variable 'ansible_shell_type' from source: unknown 25052 1726882483.59150: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.59152: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.59156: variable 'ansible_pipelining' from source: unknown 25052 1726882483.59158: variable 'ansible_timeout' from source: unknown 25052 1726882483.59162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.59264: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.59273: variable 'omit' from source: magic vars 25052 1726882483.59276: starting attempt loop 25052 1726882483.59279: running the handler 25052 1726882483.59375: variable 'ipv6_route' from source: set_fact 25052 1726882483.59388: handler run complete 25052 1726882483.59407: attempt loop complete, returning result 25052 1726882483.59411: _execute() done 25052 1726882483.59414: dumping result to json 25052 1726882483.59416: done dumping result, returning 25052 1726882483.59419: done running TaskExecutor() for managed_node2/TASK: Show ipv6_route [12673a56-9f93-f7f6-4a6d-000000000062] 25052 1726882483.59421: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000062 25052 1726882483.59504: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000062 25052 1726882483.59507: WORKER PROCESS EXITING ok: [managed_node2] => { "ipv6_route.stdout": "2001:db8::/32 dev veth0 proto kernel metric 101 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev veth0 proto kernel metric 1024 pref medium\ndefault via 2001:db8::1 dev veth0 proto static metric 101 pref medium" } 25052 1726882483.59552: no more pending results, returning what we have 25052 1726882483.59556: results queue empty 25052 1726882483.59557: checking for any_errors_fatal 25052 1726882483.59567: done checking for any_errors_fatal 25052 1726882483.59568: checking for max_fail_percentage 25052 1726882483.59570: done checking for max_fail_percentage 25052 1726882483.59570: checking to see if all hosts have failed and the running result is not ok 25052 1726882483.59571: done checking to see if all hosts have failed 25052 1726882483.59572: getting the remaining hosts for this loop 25052 1726882483.59573: done getting the remaining hosts for this loop 25052 1726882483.59577: getting the next task for host managed_node2 25052 1726882483.59584: done getting next task for host managed_node2 25052 1726882483.59586: ^ task is: 
TASK: Assert default ipv6 route is set 25052 1726882483.59588: ^ state is: HOST STATE: block=3, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882483.59595: getting variables 25052 1726882483.59597: in VariableManager get_vars() 25052 1726882483.59635: Calling all_inventory to load vars for managed_node2 25052 1726882483.59638: Calling groups_inventory to load vars for managed_node2 25052 1726882483.59640: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882483.59649: Calling all_plugins_play to load vars for managed_node2 25052 1726882483.59651: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882483.59653: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.60527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.61405: done with get_vars() 25052 1726882483.61422: done getting variables 25052 1726882483.61467: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert default ipv6 route is set] **************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:76 Friday 20 September 2024 21:34:43 -0400 (0:00:00.035) 0:00:20.569 ****** 25052 1726882483.61490: entering _queue_task() for managed_node2/assert 25052 1726882483.61753: worker is 1 (out of 1 available) 25052 1726882483.61769: exiting _queue_task() for managed_node2/assert 25052 1726882483.61780: done queuing things up, now waiting for results queue to drain 25052 1726882483.61782: waiting for pending results... 
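The trace above covers the "Get ipv6 routes" and "Show ipv6_route" steps of tests_ipv6.yml, and the "Assert default ipv6 route is set" step is being queued next. What the log shows is a command module run of "ip -6 route" whose output is later available as ipv6_route, a debug task that prints ipv6_route.stdout, and an assert whose conditional (__test_str in ipv6_route.stdout) evaluates to True further down. A minimal sketch of what these tasks could look like, reconstructed only from the module arguments and variable sources visible in this log; the register name ipv6_route, the use of the generic command/debug/assert modules, and the __test_str task var are inferred, and the exact YAML in the test playbook may differ:

    - name: Get ipv6 routes
      command: ip -6 route
      register: ipv6_route

    - name: Show ipv6_route
      debug:
        var: ipv6_route.stdout

    - name: Assert default ipv6 route is set
      assert:
        that:
          # __test_str is a task var that references the 'interface' play var;
          # its definition is not shown in this part of the log.
          - __test_str in ipv6_route.stdout
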
25052 1726882483.61955: running TaskExecutor() for managed_node2/TASK: Assert default ipv6 route is set 25052 1726882483.62021: in run() - task 12673a56-9f93-f7f6-4a6d-000000000063 25052 1726882483.62031: variable 'ansible_search_path' from source: unknown 25052 1726882483.62060: calling self._execute() 25052 1726882483.62140: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.62144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.62152: variable 'omit' from source: magic vars 25052 1726882483.62431: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.62443: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.62454: variable 'omit' from source: magic vars 25052 1726882483.62468: variable 'omit' from source: magic vars 25052 1726882483.62498: variable 'omit' from source: magic vars 25052 1726882483.62528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.62556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.62573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.62585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.62597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.62622: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.62625: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.62628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.62699: Set connection var ansible_pipelining to False 25052 1726882483.62702: Set connection var ansible_connection to ssh 25052 1726882483.62705: Set connection var ansible_shell_type to sh 25052 1726882483.62711: Set connection var ansible_timeout to 10 25052 1726882483.62718: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.62722: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.62738: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.62740: variable 'ansible_connection' from source: unknown 25052 1726882483.62743: variable 'ansible_module_compression' from source: unknown 25052 1726882483.62746: variable 'ansible_shell_type' from source: unknown 25052 1726882483.62748: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.62750: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.62753: variable 'ansible_pipelining' from source: unknown 25052 1726882483.62755: variable 'ansible_timeout' from source: unknown 25052 1726882483.62760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.62861: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.62870: variable 'omit' from source: magic vars 25052 1726882483.62873: starting attempt loop 25052 1726882483.62876: running 
the handler 25052 1726882483.62983: variable '__test_str' from source: task vars 25052 1726882483.63038: variable 'interface' from source: play vars 25052 1726882483.63045: variable 'ipv6_route' from source: set_fact 25052 1726882483.63055: Evaluated conditional (__test_str in ipv6_route.stdout): True 25052 1726882483.63060: handler run complete 25052 1726882483.63070: attempt loop complete, returning result 25052 1726882483.63074: _execute() done 25052 1726882483.63076: dumping result to json 25052 1726882483.63078: done dumping result, returning 25052 1726882483.63084: done running TaskExecutor() for managed_node2/TASK: Assert default ipv6 route is set [12673a56-9f93-f7f6-4a6d-000000000063] 25052 1726882483.63088: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000063 25052 1726882483.63172: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000063 25052 1726882483.63175: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 25052 1726882483.63262: no more pending results, returning what we have 25052 1726882483.63266: results queue empty 25052 1726882483.63267: checking for any_errors_fatal 25052 1726882483.63272: done checking for any_errors_fatal 25052 1726882483.63273: checking for max_fail_percentage 25052 1726882483.63274: done checking for max_fail_percentage 25052 1726882483.63275: checking to see if all hosts have failed and the running result is not ok 25052 1726882483.63276: done checking to see if all hosts have failed 25052 1726882483.63277: getting the remaining hosts for this loop 25052 1726882483.63278: done getting the remaining hosts for this loop 25052 1726882483.63281: getting the next task for host managed_node2 25052 1726882483.63287: done getting next task for host managed_node2 25052 1726882483.63290: ^ task is: TASK: Ensure ping6 command is present 25052 1726882483.63295: ^ state is: HOST STATE: block=3, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882483.63298: getting variables 25052 1726882483.63300: in VariableManager get_vars() 25052 1726882483.63336: Calling all_inventory to load vars for managed_node2 25052 1726882483.63339: Calling groups_inventory to load vars for managed_node2 25052 1726882483.63341: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882483.63349: Calling all_plugins_play to load vars for managed_node2 25052 1726882483.63352: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882483.63354: Calling groups_plugins_play to load vars for managed_node2 25052 1726882483.64149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882483.65208: done with get_vars() 25052 1726882483.65229: done getting variables 25052 1726882483.65286: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure ping6 command is present] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81 Friday 20 September 2024 21:34:43 -0400 (0:00:00.038) 0:00:20.608 ****** 25052 1726882483.65317: entering _queue_task() for managed_node2/package 25052 1726882483.65616: worker is 1 (out of 1 available) 25052 1726882483.65631: exiting _queue_task() for managed_node2/package 25052 1726882483.65641: done queuing things up, now waiting for results queue to drain 25052 1726882483.65642: waiting for pending results... 25052 1726882483.66053: running TaskExecutor() for managed_node2/TASK: Ensure ping6 command is present 25052 1726882483.66060: in run() - task 12673a56-9f93-f7f6-4a6d-000000000064 25052 1726882483.66063: variable 'ansible_search_path' from source: unknown 25052 1726882483.66066: calling self._execute() 25052 1726882483.66155: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.66159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.66169: variable 'omit' from source: magic vars 25052 1726882483.66450: variable 'ansible_distribution_major_version' from source: facts 25052 1726882483.66461: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882483.66466: variable 'omit' from source: magic vars 25052 1726882483.66485: variable 'omit' from source: magic vars 25052 1726882483.66621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882483.68501: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882483.68504: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882483.68506: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882483.68509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882483.68511: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882483.68513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882483.68516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882483.68518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882483.68520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882483.68522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882483.68567: variable '__network_is_ostree' from source: set_fact 25052 1726882483.68571: variable 'omit' from source: magic vars 25052 1726882483.68600: variable 'omit' from source: magic vars 25052 1726882483.68623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882483.68648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882483.68665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882483.68679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.68690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882483.68751: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882483.68755: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.68757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.68826: Set connection var ansible_pipelining to False 25052 1726882483.68829: Set connection var ansible_connection to ssh 25052 1726882483.68832: Set connection var ansible_shell_type to sh 25052 1726882483.68834: Set connection var ansible_timeout to 10 25052 1726882483.68935: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882483.68939: Set connection var ansible_shell_executable to /bin/sh 25052 1726882483.68941: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.68943: variable 'ansible_connection' from source: unknown 25052 1726882483.68946: variable 'ansible_module_compression' from source: unknown 25052 1726882483.68948: variable 'ansible_shell_type' from source: unknown 25052 1726882483.68950: variable 'ansible_shell_executable' from source: unknown 25052 1726882483.68952: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882483.68955: variable 'ansible_pipelining' from source: unknown 25052 1726882483.68957: variable 'ansible_timeout' from source: unknown 25052 1726882483.68959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882483.69052: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882483.69056: variable 'omit' from source: magic vars 25052 1726882483.69058: starting attempt loop 25052 1726882483.69061: running the handler 25052 1726882483.69063: variable 'ansible_facts' from source: unknown 25052 1726882483.69065: variable 'ansible_facts' from source: unknown 25052 1726882483.69067: _low_level_execute_command(): starting 25052 1726882483.69069: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882483.69685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.69699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.69711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.69724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.69736: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882483.69743: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882483.69799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.69802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882483.69805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882483.69807: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882483.69813: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.69816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.69819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.69820: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882483.69823: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882483.69824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.69879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.70020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.70089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.71682: stdout chunk (state=3): >>>/root <<< 25052 1726882483.71831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.71834: stdout chunk (state=3): >>><<< 25052 1726882483.71837: stderr chunk (state=3): >>><<< 25052 1726882483.71940: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.71943: _low_level_execute_command(): starting 25052 1726882483.71947: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014 `" && echo ansible-tmp-1726882483.7185616-26013-65035326881014="` echo /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014 `" ) && sleep 0' 25052 1726882483.72610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.72624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.72646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.72748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.74613: stdout chunk (state=3): >>>ansible-tmp-1726882483.7185616-26013-65035326881014=/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014 <<< 25052 1726882483.74762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.74781: stdout chunk (state=3): >>><<< 25052 1726882483.74811: stderr chunk (state=3): >>><<< 25052 1726882483.74958: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882483.7185616-26013-65035326881014=/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.74961: variable 'ansible_module_compression' from source: unknown 25052 1726882483.74964: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25052 1726882483.74989: variable 'ansible_facts' from source: unknown 25052 1726882483.75122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py 25052 1726882483.75323: Sending initial data 25052 1726882483.75326: Sent initial data (151 bytes) 25052 1726882483.75909: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.75925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.75965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882483.75981: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882483.76068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882483.76087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.76103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.76196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.77765: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server 
supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882483.77930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882483.77987: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpzi8g7k75 /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py <<< 25052 1726882483.77989: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py" <<< 25052 1726882483.78064: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 25052 1726882483.78079: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpzi8g7k75" to remote "/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py" <<< 25052 1726882483.79238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.79250: stdout chunk (state=3): >>><<< 25052 1726882483.79274: stderr chunk (state=3): >>><<< 25052 1726882483.79359: done transferring module to remote 25052 1726882483.79375: _low_level_execute_command(): starting 25052 1726882483.79385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/ /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py && sleep 0' 25052 1726882483.80065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.80080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882483.80095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882483.80207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882483.80235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.80257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.80357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882483.82201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882483.82218: stdout chunk (state=3): >>><<< 25052 1726882483.82220: stderr chunk (state=3): >>><<< 25052 1726882483.82237: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 
2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882483.82319: _low_level_execute_command(): starting 25052 1726882483.82324: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/AnsiballZ_dnf.py && sleep 0' 25052 1726882483.82885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882483.82999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882483.83023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882483.83129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.23052: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": 
null}}} <<< 25052 1726882484.27009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.27074: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. <<< 25052 1726882484.27109: stdout chunk (state=3): >>><<< 25052 1726882484.27113: stderr chunk (state=3): >>><<< 25052 1726882484.27259: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iputils"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
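The dnf invocation above corresponds to the "Ensure ping6 command is present" task at tests_ipv6.yml:81: the action plugin loaded is 'package', it resolved to ansible.legacy.dnf with name=["iputils"] and state=present, and the result was "Nothing to do" because the package was already installed. A plausible sketch of the task, assuming the generic package module as the action-plugin lookup in the log suggests; the actual playbook may spell it differently:

    - name: Ensure ping6 command is present
      package:
        name: iputils
        state: present
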
25052 1726882484.27268: done with _execute_module (ansible.legacy.dnf, {'name': 'iputils', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882484.27271: _low_level_execute_command(): starting 25052 1726882484.27274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882483.7185616-26013-65035326881014/ > /dev/null 2>&1 && sleep 0' 25052 1726882484.27877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.27890: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.27954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882484.27970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.28060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882484.28081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.28100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.28200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.30198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.30202: stdout chunk (state=3): >>><<< 25052 1726882484.30204: stderr chunk (state=3): >>><<< 25052 1726882484.30207: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882484.30209: handler run complete 25052 1726882484.30211: attempt loop complete, returning result 25052 1726882484.30213: _execute() done 25052 1726882484.30215: dumping result to json 25052 1726882484.30217: done dumping result, returning 25052 1726882484.30219: done running TaskExecutor() for managed_node2/TASK: Ensure ping6 command is present [12673a56-9f93-f7f6-4a6d-000000000064] 25052 1726882484.30221: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000064 25052 1726882484.30283: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000064 25052 1726882484.30286: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25052 1726882484.30372: no more pending results, returning what we have 25052 1726882484.30376: results queue empty 25052 1726882484.30377: checking for any_errors_fatal 25052 1726882484.30382: done checking for any_errors_fatal 25052 1726882484.30383: checking for max_fail_percentage 25052 1726882484.30385: done checking for max_fail_percentage 25052 1726882484.30386: checking to see if all hosts have failed and the running result is not ok 25052 1726882484.30387: done checking to see if all hosts have failed 25052 1726882484.30388: getting the remaining hosts for this loop 25052 1726882484.30389: done getting the remaining hosts for this loop 25052 1726882484.30398: getting the next task for host managed_node2 25052 1726882484.30409: done getting next task for host managed_node2 25052 1726882484.30412: ^ task is: TASK: Test gateway can be pinged 25052 1726882484.30414: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882484.30418: getting variables 25052 1726882484.30420: in VariableManager get_vars() 25052 1726882484.30460: Calling all_inventory to load vars for managed_node2 25052 1726882484.30463: Calling groups_inventory to load vars for managed_node2 25052 1726882484.30465: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882484.30476: Calling all_plugins_play to load vars for managed_node2 25052 1726882484.30479: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882484.30482: Calling groups_plugins_play to load vars for managed_node2 25052 1726882484.32115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882484.33754: done with get_vars() 25052 1726882484.33784: done getting variables 25052 1726882484.33850: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Test gateway can be pinged] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:86 Friday 20 September 2024 21:34:44 -0400 (0:00:00.685) 0:00:21.293 ****** 25052 1726882484.33885: entering _queue_task() for managed_node2/command 25052 1726882484.34259: worker is 1 (out of 1 available) 25052 1726882484.34271: exiting _queue_task() for managed_node2/command 25052 1726882484.34283: done queuing things up, now waiting for results queue to drain 25052 1726882484.34285: waiting for pending results... 25052 1726882484.34566: running TaskExecutor() for managed_node2/TASK: Test gateway can be pinged 25052 1726882484.34654: in run() - task 12673a56-9f93-f7f6-4a6d-000000000065 25052 1726882484.34667: variable 'ansible_search_path' from source: unknown 25052 1726882484.34705: calling self._execute() 25052 1726882484.34807: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.34812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.34829: variable 'omit' from source: magic vars 25052 1726882484.35180: variable 'ansible_distribution_major_version' from source: facts 25052 1726882484.35199: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882484.35202: variable 'omit' from source: magic vars 25052 1726882484.35217: variable 'omit' from source: magic vars 25052 1726882484.35252: variable 'omit' from source: magic vars 25052 1726882484.35298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882484.35329: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882484.35347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882484.35400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882484.35403: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882484.35414: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882484.35417: variable 'ansible_host' from source: host vars for 
'managed_node2' 25052 1726882484.35420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.35514: Set connection var ansible_pipelining to False 25052 1726882484.35522: Set connection var ansible_connection to ssh 25052 1726882484.35525: Set connection var ansible_shell_type to sh 25052 1726882484.35528: Set connection var ansible_timeout to 10 25052 1726882484.35599: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882484.35602: Set connection var ansible_shell_executable to /bin/sh 25052 1726882484.35605: variable 'ansible_shell_executable' from source: unknown 25052 1726882484.35607: variable 'ansible_connection' from source: unknown 25052 1726882484.35609: variable 'ansible_module_compression' from source: unknown 25052 1726882484.35612: variable 'ansible_shell_type' from source: unknown 25052 1726882484.35614: variable 'ansible_shell_executable' from source: unknown 25052 1726882484.35616: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.35618: variable 'ansible_pipelining' from source: unknown 25052 1726882484.35621: variable 'ansible_timeout' from source: unknown 25052 1726882484.35623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.35727: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882484.35740: variable 'omit' from source: magic vars 25052 1726882484.35743: starting attempt loop 25052 1726882484.35746: running the handler 25052 1726882484.35757: _low_level_execute_command(): starting 25052 1726882484.35766: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882484.36518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.36522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.36524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.36547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.36553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.36560: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882484.36570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.36597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882484.36700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.36705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 
1726882484.36808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.38429: stdout chunk (state=3): >>>/root <<< 25052 1726882484.38499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.38700: stderr chunk (state=3): >>><<< 25052 1726882484.38703: stdout chunk (state=3): >>><<< 25052 1726882484.38707: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882484.38709: _low_level_execute_command(): starting 25052 1726882484.38712: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375 `" && echo ansible-tmp-1726882484.3856227-26041-152782701390375="` echo /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375 `" ) && sleep 0' 25052 1726882484.39155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.39164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.39174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.39187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.39205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.39212: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882484.39222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.39236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882484.39243: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882484.39250: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882484.39262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.39270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.39279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.39287: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.39297: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882484.39372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.39377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882484.39386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.39408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.39500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.41372: stdout chunk (state=3): >>>ansible-tmp-1726882484.3856227-26041-152782701390375=/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375 <<< 25052 1726882484.41519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.41522: stdout chunk (state=3): >>><<< 25052 1726882484.41524: stderr chunk (state=3): >>><<< 25052 1726882484.41628: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882484.3856227-26041-152782701390375=/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882484.41631: variable 'ansible_module_compression' from source: unknown 25052 1726882484.41634: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882484.41798: variable 'ansible_facts' from source: unknown 25052 1726882484.41801: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py 25052 1726882484.41938: Sending initial data 25052 1726882484.41941: Sent initial data (156 bytes) 25052 1726882484.42547: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.42557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.42601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.42611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.42621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.42630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.42689: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.42718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882484.42729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.42747: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.42831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.44710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882484.44714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882484.44790: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpjke02d_l /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py <<< 25052 1726882484.44800: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py" <<< 25052 1726882484.44850: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpjke02d_l" to remote "/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py" <<< 25052 1726882484.46298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.46317: stderr chunk (state=3): >>><<< 25052 1726882484.46320: stdout chunk (state=3): >>><<< 25052 1726882484.46385: done transferring module to remote 25052 1726882484.46399: _low_level_execute_command(): starting 25052 1726882484.46465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/ /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py && sleep 0' 25052 1726882484.47498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.47502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882484.47520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.47526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.47567: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.47573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.47769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882484.47807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.47811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.47902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.49635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.49742: stderr chunk (state=3): >>><<< 25052 1726882484.49745: stdout chunk (state=3): >>><<< 25052 1726882484.49855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882484.49859: _low_level_execute_command(): starting 25052 1726882484.49863: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/AnsiballZ_command.py && sleep 0' 25052 1726882484.50939: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.51139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.51308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.51508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.66561: stdout chunk (state=3): >>> {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.051 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.051/0.051/0.051/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:34:44.660898", "end": "2024-09-20 21:34:44.664654", "delta": "0:00:00.003756", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882484.67984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882484.68031: stderr chunk (state=3): >>><<< 25052 1726882484.68034: stdout chunk (state=3): >>><<< 25052 1726882484.68053: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "PING 2001:db8::1 (2001:db8::1) 56 data bytes\n64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.051 ms\n\n--- 2001:db8::1 ping statistics ---\n1 packets transmitted, 1 received, 0% packet loss, time 0ms\nrtt min/avg/max/mdev = 0.051/0.051/0.051/0.000 ms", "stderr": "", "rc": 0, "cmd": ["ping6", "-c1", "2001:db8::1"], "start": "2024-09-20 21:34:44.660898", "end": "2024-09-20 21:34:44.664654", "delta": "0:00:00.003756", "msg": "", "invocation": {"module_args": {"_raw_params": "ping6 -c1 2001:db8::1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
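The trace above is the complete remote execution path for the "Test gateway can be pinged" task: probe the remote home directory, create a temporary directory, SFTP the AnsiballZ_command.py payload across, chmod it, run it with /usr/bin/python3.12, and read the module's JSON result back over the multiplexed SSH connection. Note that the raw module output reports "changed": true while the final task result below reports "changed": false, which suggests the task overrides the changed state (for example with changed_when: false). A minimal, hedged reconstruction of such a task is sketched here; only the task name, the command, and the address 2001:db8::1 come from the log, everything else is an assumption and not the actual contents of tests_ipv6.yml:

# Sketch only -- not the real task body from tests_ipv6.yml.
# Name, command and target address appear in the log above;
# changed_when: false is inferred from the final "changed": false result.
- name: Test gateway can be pinged
  ansible.builtin.command: ping6 -c1 2001:db8::1
  changed_when: false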
25052 1726882484.68107: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ping6 -c1 2001:db8::1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882484.68115: _low_level_execute_command(): starting 25052 1726882484.68117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882484.3856227-26041-152782701390375/ > /dev/null 2>&1 && sleep 0' 25052 1726882484.68699: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882484.68703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.68705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.68708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.68897: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.68901: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882484.68904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.68909: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882484.68912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882484.68914: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882484.68916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882484.68918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882484.68920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882484.68922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882484.68924: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882484.68926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882484.68928: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882484.68930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882484.68932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882484.68974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882484.70831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882484.70870: stderr chunk (state=3): >>><<< 25052 1726882484.70874: stdout chunk (state=3): >>><<< 25052 1726882484.71042: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882484.71045: handler run complete 25052 1726882484.71047: Evaluated conditional (False): False 25052 1726882484.71049: attempt loop complete, returning result 25052 1726882484.71051: _execute() done 25052 1726882484.71053: dumping result to json 25052 1726882484.71055: done dumping result, returning 25052 1726882484.71056: done running TaskExecutor() for managed_node2/TASK: Test gateway can be pinged [12673a56-9f93-f7f6-4a6d-000000000065] 25052 1726882484.71058: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000065 25052 1726882484.71700: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000065 ok: [managed_node2] => { "changed": false, "cmd": [ "ping6", "-c1", "2001:db8::1" ], "delta": "0:00:00.003756", "end": "2024-09-20 21:34:44.664654", "rc": 0, "start": "2024-09-20 21:34:44.660898" } STDOUT: PING 2001:db8::1 (2001:db8::1) 56 data bytes 64 bytes from 2001:db8::1: icmp_seq=1 ttl=64 time=0.051 ms --- 2001:db8::1 ping statistics --- 1 packets transmitted, 1 received, 0% packet loss, time 0ms rtt min/avg/max/mdev = 0.051/0.051/0.051/0.000 ms 25052 1726882484.71769: no more pending results, returning what we have 25052 1726882484.71772: results queue empty 25052 1726882484.71773: checking for any_errors_fatal 25052 1726882484.71781: done checking for any_errors_fatal 25052 1726882484.71782: checking for max_fail_percentage 25052 1726882484.71784: done checking for max_fail_percentage 25052 1726882484.71784: checking to see if all hosts have failed and the running result is not ok 25052 1726882484.71785: done checking to see if all hosts have failed 25052 1726882484.71786: getting the remaining hosts for this loop 25052 1726882484.71787: done getting the remaining hosts for this loop 25052 1726882484.71791: getting the next task for host managed_node2 25052 1726882484.71799: done getting next task for host managed_node2 25052 1726882484.71802: ^ task is: TASK: TEARDOWN: remove profiles. 25052 1726882484.71804: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882484.71807: getting variables 25052 1726882484.71808: in VariableManager get_vars() 25052 1726882484.71844: Calling all_inventory to load vars for managed_node2 25052 1726882484.71846: Calling groups_inventory to load vars for managed_node2 25052 1726882484.71848: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882484.71859: Calling all_plugins_play to load vars for managed_node2 25052 1726882484.71862: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882484.71867: Calling groups_plugins_play to load vars for managed_node2 25052 1726882484.72386: WORKER PROCESS EXITING 25052 1726882484.75185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882484.78660: done with get_vars() 25052 1726882484.78803: done getting variables 25052 1726882484.78866: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEARDOWN: remove profiles.] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:92 Friday 20 September 2024 21:34:44 -0400 (0:00:00.450) 0:00:21.743 ****** 25052 1726882484.79009: entering _queue_task() for managed_node2/debug 25052 1726882484.79642: worker is 1 (out of 1 available) 25052 1726882484.79767: exiting _queue_task() for managed_node2/debug 25052 1726882484.79779: done queuing things up, now waiting for results queue to drain 25052 1726882484.79780: waiting for pending results... 25052 1726882484.80155: running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. 
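The task queued next is the "TEARDOWN: remove profiles." banner at tests_ipv6.yml:92. The action plugin loaded for it is debug, and (as its result further down shows) it prints only a row of '#' characters, so it is a visual marker rather than the teardown work itself. A hedged reconstruction based solely on what the log reveals might look like this; the exact message string is taken from the MSG output below, the rest is assumed:

# Sketch only; the real task body in tests_ipv6.yml is not shown in the log.
- name: "TEARDOWN: remove profiles."
  ansible.builtin.debug:
    msg: "##################################################"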
25052 1726882484.80251: in run() - task 12673a56-9f93-f7f6-4a6d-000000000066 25052 1726882484.80270: variable 'ansible_search_path' from source: unknown 25052 1726882484.80700: calling self._execute() 25052 1726882484.80704: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.80707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.80710: variable 'omit' from source: magic vars 25052 1726882484.81402: variable 'ansible_distribution_major_version' from source: facts 25052 1726882484.81420: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882484.81431: variable 'omit' from source: magic vars 25052 1726882484.81457: variable 'omit' from source: magic vars 25052 1726882484.81898: variable 'omit' from source: magic vars 25052 1726882484.81902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882484.81905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882484.81907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882484.81910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882484.81912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882484.81914: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882484.81916: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.81918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.82178: Set connection var ansible_pipelining to False 25052 1726882484.82186: Set connection var ansible_connection to ssh 25052 1726882484.82198: Set connection var ansible_shell_type to sh 25052 1726882484.82213: Set connection var ansible_timeout to 10 25052 1726882484.82225: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882484.82235: Set connection var ansible_shell_executable to /bin/sh 25052 1726882484.82262: variable 'ansible_shell_executable' from source: unknown 25052 1726882484.82270: variable 'ansible_connection' from source: unknown 25052 1726882484.82279: variable 'ansible_module_compression' from source: unknown 25052 1726882484.82289: variable 'ansible_shell_type' from source: unknown 25052 1726882484.82301: variable 'ansible_shell_executable' from source: unknown 25052 1726882484.82309: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.82698: variable 'ansible_pipelining' from source: unknown 25052 1726882484.82701: variable 'ansible_timeout' from source: unknown 25052 1726882484.82704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.82707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882484.82710: variable 'omit' from source: magic vars 25052 1726882484.82717: starting attempt loop 25052 1726882484.82720: running the handler 25052 1726882484.82918: handler run complete 25052 1726882484.82942: attempt loop complete, 
returning result 25052 1726882484.82951: _execute() done 25052 1726882484.82958: dumping result to json 25052 1726882484.82965: done dumping result, returning 25052 1726882484.82976: done running TaskExecutor() for managed_node2/TASK: TEARDOWN: remove profiles. [12673a56-9f93-f7f6-4a6d-000000000066] 25052 1726882484.82986: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000066 ok: [managed_node2] => {} MSG: ################################################## 25052 1726882484.83126: no more pending results, returning what we have 25052 1726882484.83130: results queue empty 25052 1726882484.83131: checking for any_errors_fatal 25052 1726882484.83164: done checking for any_errors_fatal 25052 1726882484.83165: checking for max_fail_percentage 25052 1726882484.83167: done checking for max_fail_percentage 25052 1726882484.83168: checking to see if all hosts have failed and the running result is not ok 25052 1726882484.83168: done checking to see if all hosts have failed 25052 1726882484.83169: getting the remaining hosts for this loop 25052 1726882484.83170: done getting the remaining hosts for this loop 25052 1726882484.83173: getting the next task for host managed_node2 25052 1726882484.83183: done getting next task for host managed_node2 25052 1726882484.83187: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25052 1726882484.83190: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882484.83207: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000066 25052 1726882484.83211: WORKER PROCESS EXITING 25052 1726882484.83408: getting variables 25052 1726882484.83411: in VariableManager get_vars() 25052 1726882484.83449: Calling all_inventory to load vars for managed_node2 25052 1726882484.83452: Calling groups_inventory to load vars for managed_node2 25052 1726882484.83454: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882484.83463: Calling all_plugins_play to load vars for managed_node2 25052 1726882484.83465: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882484.83467: Calling groups_plugins_play to load vars for managed_node2 25052 1726882484.86212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882484.89508: done with get_vars() 25052 1726882484.89533: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:34:44 -0400 (0:00:00.108) 0:00:21.852 ****** 25052 1726882484.89743: entering _queue_task() for managed_node2/include_tasks 25052 1726882484.90551: worker is 1 (out of 1 available) 25052 1726882484.90565: exiting _queue_task() for managed_node2/include_tasks 25052 1726882484.90575: done queuing things up, now waiting for results queue to drain 25052 1726882484.90576: waiting for pending results... 
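At this point the play hands control to the fedora.linux_system_roles.network role. Its first task, "Ensure ansible_facts used by role" (roles/network/tasks/main.yml:4), runs as an include_tasks, and the trace below shows it pulling in set_facts.yml. The conditional ansible_distribution_major_version != '6' is evaluated for every task in this run; where that guard is actually attached (play, block, or task level) is not visible in the log. A hedged approximation of the include step, not the role's actual source:

# Sketch only; main.yml:4 in the role is not reproduced in this log.
# The placement of the version guard is an assumption.
- name: Ensure ansible_facts used by role
  ansible.builtin.include_tasks: set_facts.yml
  when: ansible_distribution_major_version != '6'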
25052 1726882484.90805: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 25052 1726882484.90945: in run() - task 12673a56-9f93-f7f6-4a6d-00000000006e 25052 1726882484.90961: variable 'ansible_search_path' from source: unknown 25052 1726882484.90965: variable 'ansible_search_path' from source: unknown 25052 1726882484.91010: calling self._execute() 25052 1726882484.91121: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882484.91125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882484.91137: variable 'omit' from source: magic vars 25052 1726882484.91612: variable 'ansible_distribution_major_version' from source: facts 25052 1726882484.91630: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882484.91637: _execute() done 25052 1726882484.91643: dumping result to json 25052 1726882484.91651: done dumping result, returning 25052 1726882484.91659: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-f7f6-4a6d-00000000006e] 25052 1726882484.91664: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000006e 25052 1726882484.91755: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000006e 25052 1726882484.91760: WORKER PROCESS EXITING 25052 1726882484.91810: no more pending results, returning what we have 25052 1726882484.91816: in VariableManager get_vars() 25052 1726882484.91872: Calling all_inventory to load vars for managed_node2 25052 1726882484.91875: Calling groups_inventory to load vars for managed_node2 25052 1726882484.91878: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882484.91891: Calling all_plugins_play to load vars for managed_node2 25052 1726882484.91899: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882484.91903: Calling groups_plugins_play to load vars for managed_node2 25052 1726882484.94199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882484.96475: done with get_vars() 25052 1726882484.96498: variable 'ansible_search_path' from source: unknown 25052 1726882484.96500: variable 'ansible_search_path' from source: unknown 25052 1726882484.96541: we have included files to process 25052 1726882484.96543: generating all_blocks data 25052 1726882484.96546: done generating all_blocks data 25052 1726882484.96552: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882484.96553: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882484.96555: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 25052 1726882484.97543: done processing included file 25052 1726882484.97550: iterating over new_blocks loaded from include file 25052 1726882484.97551: in VariableManager get_vars() 25052 1726882484.97577: done with get_vars() 25052 1726882484.97579: filtering new block on tags 25052 1726882484.97598: done filtering new block on tags 25052 1726882484.97602: in VariableManager get_vars() 25052 1726882484.97626: done with get_vars() 25052 1726882484.97628: filtering new block on tags 25052 1726882484.97648: done filtering new block on tags 25052 1726882484.97650: in 
VariableManager get_vars() 25052 1726882484.97810: done with get_vars() 25052 1726882484.97813: filtering new block on tags 25052 1726882484.97831: done filtering new block on tags 25052 1726882484.97833: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 25052 1726882484.97839: extending task lists for all hosts with included blocks 25052 1726882484.98862: done extending task lists 25052 1726882484.98864: done processing included files 25052 1726882484.98864: results queue empty 25052 1726882484.98865: checking for any_errors_fatal 25052 1726882484.98869: done checking for any_errors_fatal 25052 1726882484.98870: checking for max_fail_percentage 25052 1726882484.98871: done checking for max_fail_percentage 25052 1726882484.98872: checking to see if all hosts have failed and the running result is not ok 25052 1726882484.98872: done checking to see if all hosts have failed 25052 1726882484.98873: getting the remaining hosts for this loop 25052 1726882484.98874: done getting the remaining hosts for this loop 25052 1726882484.98877: getting the next task for host managed_node2 25052 1726882484.98882: done getting next task for host managed_node2 25052 1726882484.98884: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25052 1726882484.98887: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882484.98901: getting variables 25052 1726882484.98903: in VariableManager get_vars() 25052 1726882484.98923: Calling all_inventory to load vars for managed_node2 25052 1726882484.98926: Calling groups_inventory to load vars for managed_node2 25052 1726882484.98928: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882484.98934: Calling all_plugins_play to load vars for managed_node2 25052 1726882484.98936: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882484.98939: Calling groups_plugins_play to load vars for managed_node2 25052 1726882485.00165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882485.01859: done with get_vars() 25052 1726882485.01878: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:34:45 -0400 (0:00:00.122) 0:00:21.974 ****** 25052 1726882485.01962: entering _queue_task() for managed_node2/setup 25052 1726882485.02349: worker is 1 (out of 1 available) 25052 1726882485.02363: exiting _queue_task() for managed_node2/setup 25052 1726882485.02488: done queuing things up, now waiting for results queue to drain 25052 1726882485.02490: waiting for pending results... 25052 1726882485.02804: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 25052 1726882485.02861: in run() - task 12673a56-9f93-f7f6-4a6d-000000000513 25052 1726882485.02880: variable 'ansible_search_path' from source: unknown 25052 1726882485.02888: variable 'ansible_search_path' from source: unknown 25052 1726882485.02936: calling self._execute() 25052 1726882485.03043: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.03055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.03070: variable 'omit' from source: magic vars 25052 1726882485.03453: variable 'ansible_distribution_major_version' from source: facts 25052 1726882485.03475: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882485.03710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882485.06057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882485.06143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882485.06199: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882485.06228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882485.06260: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882485.06359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882485.06381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 25052 1726882485.06415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882485.06467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882485.06576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882485.06580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882485.06583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882485.06589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882485.06636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882485.06653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882485.06822: variable '__network_required_facts' from source: role '' defaults 25052 1726882485.06836: variable 'ansible_facts' from source: unknown 25052 1726882485.07585: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 25052 1726882485.07597: when evaluation is False, skipping this task 25052 1726882485.07605: _execute() done 25052 1726882485.07612: dumping result to json 25052 1726882485.07620: done dumping result, returning 25052 1726882485.07631: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-f7f6-4a6d-000000000513] 25052 1726882485.07642: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000513 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882485.07812: no more pending results, returning what we have 25052 1726882485.07816: results queue empty 25052 1726882485.07817: checking for any_errors_fatal 25052 1726882485.07819: done checking for any_errors_fatal 25052 1726882485.07819: checking for max_fail_percentage 25052 1726882485.07821: done checking for max_fail_percentage 25052 1726882485.07822: checking to see if all hosts have failed and the running result is not ok 25052 1726882485.07823: done checking to see if all hosts have failed 25052 1726882485.07824: getting the remaining hosts for this loop 25052 1726882485.07825: done getting the remaining hosts for this loop 25052 1726882485.07829: getting the next task for host managed_node2 25052 1726882485.07839: done getting next task for host 
managed_node2 25052 1726882485.07843: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 25052 1726882485.07848: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882485.07867: getting variables 25052 1726882485.07869: in VariableManager get_vars() 25052 1726882485.08119: Calling all_inventory to load vars for managed_node2 25052 1726882485.08123: Calling groups_inventory to load vars for managed_node2 25052 1726882485.08125: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882485.08136: Calling all_plugins_play to load vars for managed_node2 25052 1726882485.08139: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882485.08143: Calling groups_plugins_play to load vars for managed_node2 25052 1726882485.08707: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000513 25052 1726882485.08711: WORKER PROCESS EXITING 25052 1726882485.09629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882485.11168: done with get_vars() 25052 1726882485.11188: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:34:45 -0400 (0:00:00.093) 0:00:22.067 ****** 25052 1726882485.11288: entering _queue_task() for managed_node2/stat 25052 1726882485.11604: worker is 1 (out of 1 available) 25052 1726882485.11617: exiting _queue_task() for managed_node2/stat 25052 1726882485.11629: done queuing things up, now waiting for results queue to drain 25052 1726882485.11630: waiting for pending results... 
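Two guard patterns appear back to back here. The fact-gathering task just above was skipped because __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, meaning every fact the role needs is already cached. The next task, "Check if system is ostree" (set_facts.yml:12), is a stat that only runs when __network_is_ostree has not been set yet; in this run the flag already exists, so it is skipped as well. A hedged sketch of that pattern follows; the stat path and register name are illustrative assumptions, not taken from the role's source:

# Sketch only; the actual stat target and register name inside set_facts.yml
# are not shown in the log. /run/ostree-booted is an assumed, illustrative path.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat        # hypothetical register name
  when: not __network_is_ostree is defined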
25052 1726882485.11905: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 25052 1726882485.12069: in run() - task 12673a56-9f93-f7f6-4a6d-000000000515 25052 1726882485.12096: variable 'ansible_search_path' from source: unknown 25052 1726882485.12106: variable 'ansible_search_path' from source: unknown 25052 1726882485.12148: calling self._execute() 25052 1726882485.12257: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.12269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.12289: variable 'omit' from source: magic vars 25052 1726882485.12660: variable 'ansible_distribution_major_version' from source: facts 25052 1726882485.12682: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882485.12859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882485.13136: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882485.13181: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882485.13228: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882485.13265: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882485.13359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882485.13389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882485.13425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882485.13461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882485.13556: variable '__network_is_ostree' from source: set_fact 25052 1726882485.13567: Evaluated conditional (not __network_is_ostree is defined): False 25052 1726882485.13575: when evaluation is False, skipping this task 25052 1726882485.13582: _execute() done 25052 1726882485.13590: dumping result to json 25052 1726882485.13603: done dumping result, returning 25052 1726882485.13614: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-f7f6-4a6d-000000000515] 25052 1726882485.13625: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000515 25052 1726882485.13826: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000515 25052 1726882485.13829: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25052 1726882485.13882: no more pending results, returning what we have 25052 1726882485.13886: results queue empty 25052 1726882485.13887: checking for any_errors_fatal 25052 1726882485.13898: done checking for any_errors_fatal 25052 1726882485.13900: checking for 
max_fail_percentage 25052 1726882485.13901: done checking for max_fail_percentage 25052 1726882485.13902: checking to see if all hosts have failed and the running result is not ok 25052 1726882485.13903: done checking to see if all hosts have failed 25052 1726882485.13904: getting the remaining hosts for this loop 25052 1726882485.13905: done getting the remaining hosts for this loop 25052 1726882485.13909: getting the next task for host managed_node2 25052 1726882485.13916: done getting next task for host managed_node2 25052 1726882485.13920: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25052 1726882485.13923: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882485.13942: getting variables 25052 1726882485.13944: in VariableManager get_vars() 25052 1726882485.13985: Calling all_inventory to load vars for managed_node2 25052 1726882485.13988: Calling groups_inventory to load vars for managed_node2 25052 1726882485.14196: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882485.14205: Calling all_plugins_play to load vars for managed_node2 25052 1726882485.14209: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882485.14212: Calling groups_plugins_play to load vars for managed_node2 25052 1726882485.15679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882485.17619: done with get_vars() 25052 1726882485.17642: done getting variables 25052 1726882485.17707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:34:45 -0400 (0:00:00.064) 0:00:22.132 ****** 25052 1726882485.17772: entering _queue_task() for managed_node2/set_fact 25052 1726882485.18210: worker is 1 (out of 1 available) 25052 1726882485.18224: exiting _queue_task() for managed_node2/set_fact 25052 1726882485.18236: done queuing things up, now waiting for results queue to drain 25052 1726882485.18237: waiting for pending results... 
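The companion task, "Set flag to indicate system is ostree" (set_facts.yml:17), would normally turn the result of the stat above into the __network_is_ostree fact; it carries the same "not __network_is_ostree is defined" guard and is therefore skipped here too. A hedged sketch, reusing the hypothetical register name from the previous example:

# Sketch only; the real set_fact expression in set_facts.yml is not in the log.
- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"  # assumed expression
  when: not __network_is_ostree is defined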
25052 1726882485.18472: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 25052 1726882485.18627: in run() - task 12673a56-9f93-f7f6-4a6d-000000000516 25052 1726882485.18798: variable 'ansible_search_path' from source: unknown 25052 1726882485.18802: variable 'ansible_search_path' from source: unknown 25052 1726882485.18807: calling self._execute() 25052 1726882485.18810: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.18813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.18816: variable 'omit' from source: magic vars 25052 1726882485.19143: variable 'ansible_distribution_major_version' from source: facts 25052 1726882485.19160: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882485.19331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882485.19606: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882485.19658: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882485.19690: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882485.19731: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882485.19809: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882485.19842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882485.19869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882485.19898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882485.20001: variable '__network_is_ostree' from source: set_fact 25052 1726882485.20008: Evaluated conditional (not __network_is_ostree is defined): False 25052 1726882485.20011: when evaluation is False, skipping this task 25052 1726882485.20014: _execute() done 25052 1726882485.20016: dumping result to json 25052 1726882485.20020: done dumping result, returning 25052 1726882485.20028: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-f7f6-4a6d-000000000516] 25052 1726882485.20031: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000516 25052 1726882485.20164: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000516 25052 1726882485.20167: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 25052 1726882485.20333: no more pending results, returning what we have 25052 1726882485.20337: results queue empty 25052 1726882485.20337: checking for any_errors_fatal 25052 1726882485.20342: done checking for any_errors_fatal 25052 
1726882485.20343: checking for max_fail_percentage 25052 1726882485.20344: done checking for max_fail_percentage 25052 1726882485.20345: checking to see if all hosts have failed and the running result is not ok 25052 1726882485.20346: done checking to see if all hosts have failed 25052 1726882485.20347: getting the remaining hosts for this loop 25052 1726882485.20348: done getting the remaining hosts for this loop 25052 1726882485.20351: getting the next task for host managed_node2 25052 1726882485.20360: done getting next task for host managed_node2 25052 1726882485.20363: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 25052 1726882485.20366: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882485.20381: getting variables 25052 1726882485.20383: in VariableManager get_vars() 25052 1726882485.20426: Calling all_inventory to load vars for managed_node2 25052 1726882485.20428: Calling groups_inventory to load vars for managed_node2 25052 1726882485.20430: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882485.20437: Calling all_plugins_play to load vars for managed_node2 25052 1726882485.20439: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882485.20440: Calling groups_plugins_play to load vars for managed_node2 25052 1726882485.21213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882485.22161: done with get_vars() 25052 1726882485.22183: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:34:45 -0400 (0:00:00.045) 0:00:22.177 ****** 25052 1726882485.22289: entering _queue_task() for managed_node2/service_facts 25052 1726882485.22659: worker is 1 (out of 1 available) 25052 1726882485.22671: exiting _queue_task() for managed_node2/service_facts 25052 1726882485.22682: done queuing things up, now waiting for results queue to drain 25052 1726882485.22684: waiting for pending results... 
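
[editor's note] The next task, queued in the record above, is at roles/network/tasks/set_facts.yml:21 and runs the service_facts module with no arguments (the invocation logged further down shows "module_args": {}). A minimal sketch of that task, plus a purely illustrative consumer of the facts it gathers; the task name, module, and distribution conditional come from the log, while the debug task is an example only and not part of the role:

    # Sketch of set_facts.yml:21 as reconstructed from the log; whether the
    # conditional sits on the task itself or on an enclosing block is not
    # visible here.
    - name: Check which services are running
      ansible.builtin.service_facts:
      when: ansible_distribution_major_version != '6'

    # Illustrative example only: reading one entry from the gathered facts.
    - name: Report NetworkManager state (example, not part of the role)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"

service_facts populates ansible_facts.services with one entry per unit (name, state, status, source), which is exactly the JSON blob returned by the module execution logged below.
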
25052 1726882485.23112: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 25052 1726882485.23219: in run() - task 12673a56-9f93-f7f6-4a6d-000000000518 25052 1726882485.23231: variable 'ansible_search_path' from source: unknown 25052 1726882485.23234: variable 'ansible_search_path' from source: unknown 25052 1726882485.23261: calling self._execute() 25052 1726882485.23334: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.23338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.23348: variable 'omit' from source: magic vars 25052 1726882485.23611: variable 'ansible_distribution_major_version' from source: facts 25052 1726882485.23622: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882485.23625: variable 'omit' from source: magic vars 25052 1726882485.23677: variable 'omit' from source: magic vars 25052 1726882485.23703: variable 'omit' from source: magic vars 25052 1726882485.23736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882485.23763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882485.23777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882485.23795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882485.23804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882485.23827: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882485.23831: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.23833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.23905: Set connection var ansible_pipelining to False 25052 1726882485.23909: Set connection var ansible_connection to ssh 25052 1726882485.23911: Set connection var ansible_shell_type to sh 25052 1726882485.23917: Set connection var ansible_timeout to 10 25052 1726882485.23923: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882485.23928: Set connection var ansible_shell_executable to /bin/sh 25052 1726882485.23944: variable 'ansible_shell_executable' from source: unknown 25052 1726882485.23947: variable 'ansible_connection' from source: unknown 25052 1726882485.23950: variable 'ansible_module_compression' from source: unknown 25052 1726882485.23953: variable 'ansible_shell_type' from source: unknown 25052 1726882485.23957: variable 'ansible_shell_executable' from source: unknown 25052 1726882485.23959: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882485.23961: variable 'ansible_pipelining' from source: unknown 25052 1726882485.23964: variable 'ansible_timeout' from source: unknown 25052 1726882485.23966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882485.24109: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882485.24115: variable 'omit' from source: magic vars 25052 
1726882485.24120: starting attempt loop 25052 1726882485.24123: running the handler 25052 1726882485.24134: _low_level_execute_command(): starting 25052 1726882485.24141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882485.24656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882485.24662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882485.24665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.24718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882485.24722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882485.24728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882485.24796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882485.26455: stdout chunk (state=3): >>>/root <<< 25052 1726882485.26558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882485.26582: stderr chunk (state=3): >>><<< 25052 1726882485.26585: stdout chunk (state=3): >>><<< 25052 1726882485.26608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882485.26618: _low_level_execute_command(): starting 25052 1726882485.26626: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753 `" 
&& echo ansible-tmp-1726882485.2660751-26080-5864160511753="` echo /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753 `" ) && sleep 0' 25052 1726882485.27042: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882485.27045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882485.27048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882485.27057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.27099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882485.27103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882485.27168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882485.29052: stdout chunk (state=3): >>>ansible-tmp-1726882485.2660751-26080-5864160511753=/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753 <<< 25052 1726882485.29209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882485.29213: stdout chunk (state=3): >>><<< 25052 1726882485.29215: stderr chunk (state=3): >>><<< 25052 1726882485.29265: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882485.2660751-26080-5864160511753=/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882485.29398: variable 'ansible_module_compression' from source: unknown 25052 1726882485.29401: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 25052 1726882485.29425: variable 'ansible_facts' from source: unknown 25052 1726882485.29481: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py 25052 1726882485.29588: Sending initial data 25052 1726882485.29596: Sent initial data (160 bytes) 25052 1726882485.30016: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882485.30021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882485.30023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.30025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882485.30027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.30077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882485.30084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882485.30144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882485.31675: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882485.31742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882485.31809: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmplpjgdbb7 /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py <<< 25052 1726882485.31811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py" <<< 25052 1726882485.31866: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmplpjgdbb7" to remote "/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py" <<< 25052 1726882485.31873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py" <<< 25052 1726882485.32490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882485.32527: stderr chunk (state=3): >>><<< 25052 1726882485.32530: stdout chunk (state=3): >>><<< 25052 1726882485.32544: done transferring module to remote 25052 1726882485.32553: _low_level_execute_command(): starting 25052 1726882485.32558: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/ /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py && sleep 0' 25052 1726882485.32955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882485.32988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882485.32991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882485.32995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.32999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882485.33002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882485.33007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.33049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882485.33052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882485.33126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882485.34856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882485.34868: stderr chunk (state=3): >>><<< 25052 1726882485.34871: stdout chunk (state=3): >>><<< 25052 1726882485.34884: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882485.34887: _low_level_execute_command(): starting 25052 1726882485.34892: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/AnsiballZ_service_facts.py && sleep 0' 25052 1726882485.35291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882485.35323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882485.35326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882485.35328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882485.35331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882485.35382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882485.35388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882485.35394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882485.35450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882486.86081: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": 
"chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 25052 1726882486.86153: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": 
"systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": 
{"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 25052 1726882486.86174: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 25052 1726882486.87612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882486.87704: stderr chunk (state=3): >>><<< 25052 1726882486.87707: stdout chunk (state=3): >>><<< 25052 1726882486.87711: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", 
"source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": 
"dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882486.89368: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882486.89375: _low_level_execute_command(): starting 25052 1726882486.89380: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882485.2660751-26080-5864160511753/ > /dev/null 2>&1 && sleep 0' 25052 1726882486.90179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882486.90187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882486.90206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882486.90256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882486.90259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882486.90262: stderr chunk (state=3): >>>debug2: match not found <<< 25052 1726882486.90265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882486.90267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882486.90269: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882486.90271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 25052 1726882486.90299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882486.90302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882486.90305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882486.90406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882486.90409: stderr chunk (state=3): >>>debug2: match found <<< 25052 1726882486.90411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882486.90413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 25052 1726882486.90415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882486.90462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882486.90532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882486.92999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882486.93003: stdout chunk (state=3): >>><<< 25052 1726882486.93006: stderr chunk (state=3): >>><<< 25052 1726882486.93008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882486.93012: handler run complete 25052 1726882486.93475: variable 'ansible_facts' from source: unknown 25052 1726882486.93678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882486.94745: variable 'ansible_facts' from source: unknown 25052 1726882486.94942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882486.95203: attempt loop complete, returning result 25052 1726882486.95214: _execute() done 25052 1726882486.95224: dumping result to json 25052 1726882486.95295: done dumping result, returning 25052 1726882486.95311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-f7f6-4a6d-000000000518] 25052 1726882486.95321: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000518 25052 1726882486.96720: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000518 25052 1726882486.96723: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882486.96846: no more pending results, returning what we have 25052 1726882486.96849: results queue empty 25052 1726882486.96850: checking for any_errors_fatal 25052 1726882486.96853: done checking for any_errors_fatal 25052 1726882486.96853: checking for max_fail_percentage 25052 1726882486.96855: done checking for max_fail_percentage 25052 1726882486.96855: checking to see if all hosts have failed and the running result is not ok 25052 1726882486.96856: done checking to see if all hosts have failed 25052 
1726882486.96857: getting the remaining hosts for this loop 25052 1726882486.96858: done getting the remaining hosts for this loop 25052 1726882486.96861: getting the next task for host managed_node2 25052 1726882486.96866: done getting next task for host managed_node2 25052 1726882486.96871: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 25052 1726882486.96876: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882486.96886: getting variables 25052 1726882486.96887: in VariableManager get_vars() 25052 1726882486.96919: Calling all_inventory to load vars for managed_node2 25052 1726882486.96922: Calling groups_inventory to load vars for managed_node2 25052 1726882486.96924: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882486.96932: Calling all_plugins_play to load vars for managed_node2 25052 1726882486.96940: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882486.96943: Calling groups_plugins_play to load vars for managed_node2 25052 1726882486.99462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882487.03011: done with get_vars() 25052 1726882487.03041: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:34:47 -0400 (0:00:01.809) 0:00:23.987 ****** 25052 1726882487.03264: entering _queue_task() for managed_node2/package_facts 25052 1726882487.04147: worker is 1 (out of 1 available) 25052 1726882487.04161: exiting _queue_task() for managed_node2/package_facts 25052 1726882487.04173: done queuing things up, now waiting for results queue to drain 25052 1726882487.04175: waiting for pending results... 
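At this point the service_facts task for managed_node2 has completed (its result is shown only as censored output because no_log: true was in effect for that task, as the module args above indicate) and the follow-up package_facts task has just been queued. As a minimal, hypothetical sketch of the kind of tasks that produce this stretch of the log (the role's actual set_facts.yml may differ in wording and options), the two fact-gathering steps look roughly like:

    # Gather systemd service states into ansible_facts.services
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true   # matches the censored result reported above

    # Gather installed packages into ansible_facts.packages
    - name: Check which packages are installed
      ansible.builtin.package_facts:

Both modules only read state from the managed host; later tasks can then inspect entries such as ansible_facts.services['NetworkManager.service'].state or ansible_facts.packages['NetworkManager'] when deciding what to configure.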
25052 1726882487.04712: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 25052 1726882487.04954: in run() - task 12673a56-9f93-f7f6-4a6d-000000000519 25052 1726882487.04968: variable 'ansible_search_path' from source: unknown 25052 1726882487.04972: variable 'ansible_search_path' from source: unknown 25052 1726882487.05008: calling self._execute() 25052 1726882487.05278: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.05282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.05298: variable 'omit' from source: magic vars 25052 1726882487.06000: variable 'ansible_distribution_major_version' from source: facts 25052 1726882487.06120: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882487.06154: variable 'omit' from source: magic vars 25052 1726882487.06452: variable 'omit' from source: magic vars 25052 1726882487.06455: variable 'omit' from source: magic vars 25052 1726882487.06458: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882487.06551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882487.06665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882487.06682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882487.06698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882487.06727: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882487.06731: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.06733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.06995: Set connection var ansible_pipelining to False 25052 1726882487.06999: Set connection var ansible_connection to ssh 25052 1726882487.07001: Set connection var ansible_shell_type to sh 25052 1726882487.07089: Set connection var ansible_timeout to 10 25052 1726882487.07100: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882487.07105: Set connection var ansible_shell_executable to /bin/sh 25052 1726882487.07186: variable 'ansible_shell_executable' from source: unknown 25052 1726882487.07189: variable 'ansible_connection' from source: unknown 25052 1726882487.07196: variable 'ansible_module_compression' from source: unknown 25052 1726882487.07199: variable 'ansible_shell_type' from source: unknown 25052 1726882487.07201: variable 'ansible_shell_executable' from source: unknown 25052 1726882487.07206: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.07209: variable 'ansible_pipelining' from source: unknown 25052 1726882487.07212: variable 'ansible_timeout' from source: unknown 25052 1726882487.07214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.07618: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882487.07700: variable 'omit' from source: magic vars 25052 
1726882487.07703: starting attempt loop 25052 1726882487.07706: running the handler 25052 1726882487.07708: _low_level_execute_command(): starting 25052 1726882487.07710: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882487.08335: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882487.08346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882487.08358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882487.08372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882487.08385: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882487.08505: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.08509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.08512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.08556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.08627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.10505: stdout chunk (state=3): >>>/root <<< 25052 1726882487.10528: stdout chunk (state=3): >>><<< 25052 1726882487.10553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882487.10608: stderr chunk (state=3): >>><<< 25052 1726882487.10659: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882487.10679: _low_level_execute_command(): starting 25052 1726882487.10875: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104 `" && echo ansible-tmp-1726882487.1066613-26164-150686338731104="` echo /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104 `" ) && sleep 0' 25052 1726882487.11838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882487.11841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882487.11851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.11854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882487.11857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.12002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.12009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.12033: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.12131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.14114: stdout chunk (state=3): >>>ansible-tmp-1726882487.1066613-26164-150686338731104=/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104 <<< 25052 1726882487.14341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882487.14374: stderr chunk (state=3): >>><<< 25052 1726882487.14378: stdout chunk (state=3): >>><<< 25052 1726882487.14538: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882487.1066613-26164-150686338731104=/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882487.14567: variable 'ansible_module_compression' from source: unknown 25052 1726882487.14617: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 25052 1726882487.14679: variable 'ansible_facts' from source: unknown 25052 1726882487.15058: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py 25052 1726882487.15547: Sending initial data 25052 1726882487.15550: Sent initial data (162 bytes) 25052 1726882487.16400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882487.16404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.16413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.16519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.16522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.18180: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882487.18184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882487.18235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpcljbi7zc /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py <<< 25052 1726882487.18243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py" <<< 25052 1726882487.18336: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpcljbi7zc" to remote "/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py" <<< 25052 1726882487.20573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882487.20576: stdout chunk (state=3): >>><<< 25052 1726882487.20579: stderr chunk (state=3): >>><<< 25052 1726882487.20588: done transferring module to remote 25052 1726882487.20604: _low_level_execute_command(): starting 25052 1726882487.20614: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/ /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py && sleep 0' 25052 1726882487.21241: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882487.21254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882487.21269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882487.21287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882487.21308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882487.21339: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882487.21410: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.21454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.21468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.21491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.21578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.23408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882487.23465: stderr chunk (state=3): >>><<< 25052 1726882487.23468: stdout chunk (state=3): >>><<< 25052 1726882487.23484: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882487.23567: _low_level_execute_command(): starting 25052 1726882487.23571: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/AnsiballZ_package_facts.py && sleep 0' 25052 1726882487.24198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.24212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.24226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.24322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.68244: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", 
"version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": 
[{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 25052 1726882487.68270: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", 
"version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 25052 1726882487.68276: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 25052 1726882487.68305: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": 
[{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 25052 1726882487.68317: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", 
"version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 25052 1726882487.68360: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": 
"NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 25052 1726882487.68377: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 25052 1726882487.68405: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 25052 1726882487.68413: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": 
[{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": 
"perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 25052 1726882487.68446: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": 
"2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 25052 1726882487.68452: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 25052 1726882487.70297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882487.70301: stdout chunk (state=3): >>><<< 25052 1726882487.70303: stderr chunk (state=3): >>><<< 25052 1726882487.70505: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882487.71836: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882487.71851: _low_level_execute_command(): starting 25052 1726882487.71855: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882487.1066613-26164-150686338731104/ > /dev/null 2>&1 && sleep 0' 25052 1726882487.72275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882487.72283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882487.72311: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.72314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882487.72316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882487.72373: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882487.72380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882487.72383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882487.72443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882487.74442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882487.74446: stdout chunk (state=3): >>><<< 25052 1726882487.74448: stderr chunk (state=3): >>><<< 25052 1726882487.74451: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882487.74453: handler run complete 25052 1726882487.75318: variable 'ansible_facts' from source: unknown 25052 1726882487.75887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882487.77911: variable 'ansible_facts' from source: unknown 25052 1726882487.78399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882487.79301: attempt loop complete, returning result 25052 1726882487.79311: _execute() done 25052 1726882487.79314: dumping result to json 25052 1726882487.79516: done dumping result, returning 25052 1726882487.79700: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-f7f6-4a6d-000000000519] 25052 1726882487.79703: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000519 25052 1726882487.82015: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000519 25052 1726882487.82018: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882487.82166: no more pending results, returning what we have 25052 1726882487.82168: results queue empty 25052 1726882487.82169: checking for any_errors_fatal 25052 1726882487.82173: done checking for any_errors_fatal 25052 1726882487.82174: checking for max_fail_percentage 25052 1726882487.82175: done checking for max_fail_percentage 25052 1726882487.82176: checking to see if all hosts have failed and the running result is not ok 25052 1726882487.82177: done checking to see if all hosts have failed 25052 1726882487.82177: getting the remaining hosts for this loop 25052 1726882487.82178: done getting the remaining hosts for this loop 25052 1726882487.82181: getting the next task for host managed_node2 25052 1726882487.82188: done getting next task for host managed_node2 25052 1726882487.82190: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 25052 1726882487.82196: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 25052 1726882487.82206: getting variables 25052 1726882487.82207: in VariableManager get_vars() 25052 1726882487.82241: Calling all_inventory to load vars for managed_node2 25052 1726882487.82243: Calling groups_inventory to load vars for managed_node2 25052 1726882487.82245: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882487.82253: Calling all_plugins_play to load vars for managed_node2 25052 1726882487.82255: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882487.82258: Calling groups_plugins_play to load vars for managed_node2 25052 1726882487.83428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882487.85140: done with get_vars() 25052 1726882487.85163: done getting variables 25052 1726882487.85222: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:34:47 -0400 (0:00:00.819) 0:00:24.807 ****** 25052 1726882487.85255: entering _queue_task() for managed_node2/debug 25052 1726882487.85570: worker is 1 (out of 1 available) 25052 1726882487.85584: exiting _queue_task() for managed_node2/debug 25052 1726882487.85598: done queuing things up, now waiting for results queue to drain 25052 1726882487.85599: waiting for pending results... 
25052 1726882487.86013: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 25052 1726882487.86038: in run() - task 12673a56-9f93-f7f6-4a6d-00000000006f 25052 1726882487.86061: variable 'ansible_search_path' from source: unknown 25052 1726882487.86069: variable 'ansible_search_path' from source: unknown 25052 1726882487.86111: calling self._execute() 25052 1726882487.86219: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.86231: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.86252: variable 'omit' from source: magic vars 25052 1726882487.86623: variable 'ansible_distribution_major_version' from source: facts 25052 1726882487.86641: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882487.86653: variable 'omit' from source: magic vars 25052 1726882487.86786: variable 'omit' from source: magic vars 25052 1726882487.86824: variable 'network_provider' from source: set_fact 25052 1726882487.86848: variable 'omit' from source: magic vars 25052 1726882487.86898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882487.86938: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882487.86964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882487.86985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882487.87008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882487.87046: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882487.87055: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.87063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.87172: Set connection var ansible_pipelining to False 25052 1726882487.87181: Set connection var ansible_connection to ssh 25052 1726882487.87216: Set connection var ansible_shell_type to sh 25052 1726882487.87221: Set connection var ansible_timeout to 10 25052 1726882487.87223: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882487.87228: Set connection var ansible_shell_executable to /bin/sh 25052 1726882487.87254: variable 'ansible_shell_executable' from source: unknown 25052 1726882487.87264: variable 'ansible_connection' from source: unknown 25052 1726882487.87325: variable 'ansible_module_compression' from source: unknown 25052 1726882487.87329: variable 'ansible_shell_type' from source: unknown 25052 1726882487.87331: variable 'ansible_shell_executable' from source: unknown 25052 1726882487.87333: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.87335: variable 'ansible_pipelining' from source: unknown 25052 1726882487.87336: variable 'ansible_timeout' from source: unknown 25052 1726882487.87338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.87453: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 25052 1726882487.87472: variable 'omit' from source: magic vars 25052 1726882487.87482: starting attempt loop 25052 1726882487.87488: running the handler 25052 1726882487.87544: handler run complete 25052 1726882487.87565: attempt loop complete, returning result 25052 1726882487.87598: _execute() done 25052 1726882487.87602: dumping result to json 25052 1726882487.87604: done dumping result, returning 25052 1726882487.87607: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-f7f6-4a6d-00000000006f] 25052 1726882487.87609: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000006f 25052 1726882487.87840: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000006f 25052 1726882487.87843: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 25052 1726882487.87904: no more pending results, returning what we have 25052 1726882487.87908: results queue empty 25052 1726882487.87909: checking for any_errors_fatal 25052 1726882487.87918: done checking for any_errors_fatal 25052 1726882487.87919: checking for max_fail_percentage 25052 1726882487.87920: done checking for max_fail_percentage 25052 1726882487.87921: checking to see if all hosts have failed and the running result is not ok 25052 1726882487.87922: done checking to see if all hosts have failed 25052 1726882487.87923: getting the remaining hosts for this loop 25052 1726882487.87924: done getting the remaining hosts for this loop 25052 1726882487.87927: getting the next task for host managed_node2 25052 1726882487.87934: done getting next task for host managed_node2 25052 1726882487.87937: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25052 1726882487.87940: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882487.87951: getting variables 25052 1726882487.87953: in VariableManager get_vars() 25052 1726882487.87992: Calling all_inventory to load vars for managed_node2 25052 1726882487.87996: Calling groups_inventory to load vars for managed_node2 25052 1726882487.87999: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882487.88009: Calling all_plugins_play to load vars for managed_node2 25052 1726882487.88012: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882487.88014: Calling groups_plugins_play to load vars for managed_node2 25052 1726882487.93523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882487.95059: done with get_vars() 25052 1726882487.95083: done getting variables 25052 1726882487.95133: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:34:47 -0400 (0:00:00.099) 0:00:24.906 ****** 25052 1726882487.95163: entering _queue_task() for managed_node2/fail 25052 1726882487.95503: worker is 1 (out of 1 available) 25052 1726882487.95515: exiting _queue_task() for managed_node2/fail 25052 1726882487.95526: done queuing things up, now waiting for results queue to drain 25052 1726882487.95527: waiting for pending results... 
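
Note: the "Print network provider" task above produced "Using network provider: nm" from the network_provider variable (set earlier via set_fact). The role source at roles/network/tasks/main.yml:7 is not reproduced in this log; a hedged sketch of a debug task that would produce that output, with the exact wording assumed:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
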
25052 1726882487.95915: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 25052 1726882487.95958: in run() - task 12673a56-9f93-f7f6-4a6d-000000000070 25052 1726882487.95977: variable 'ansible_search_path' from source: unknown 25052 1726882487.95986: variable 'ansible_search_path' from source: unknown 25052 1726882487.96031: calling self._execute() 25052 1726882487.96133: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882487.96144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882487.96158: variable 'omit' from source: magic vars 25052 1726882487.96555: variable 'ansible_distribution_major_version' from source: facts 25052 1726882487.96572: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882487.96707: variable 'network_state' from source: role '' defaults 25052 1726882487.96722: Evaluated conditional (network_state != {}): False 25052 1726882487.96730: when evaluation is False, skipping this task 25052 1726882487.96737: _execute() done 25052 1726882487.96744: dumping result to json 25052 1726882487.96751: done dumping result, returning 25052 1726882487.96762: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-f7f6-4a6d-000000000070] 25052 1726882487.96775: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000070 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882487.96922: no more pending results, returning what we have 25052 1726882487.96926: results queue empty 25052 1726882487.96927: checking for any_errors_fatal 25052 1726882487.96934: done checking for any_errors_fatal 25052 1726882487.96934: checking for max_fail_percentage 25052 1726882487.96936: done checking for max_fail_percentage 25052 1726882487.96937: checking to see if all hosts have failed and the running result is not ok 25052 1726882487.96938: done checking to see if all hosts have failed 25052 1726882487.96938: getting the remaining hosts for this loop 25052 1726882487.96940: done getting the remaining hosts for this loop 25052 1726882487.96943: getting the next task for host managed_node2 25052 1726882487.96950: done getting next task for host managed_node2 25052 1726882487.96954: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25052 1726882487.96958: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882487.96978: getting variables 25052 1726882487.96980: in VariableManager get_vars() 25052 1726882487.97020: Calling all_inventory to load vars for managed_node2 25052 1726882487.97024: Calling groups_inventory to load vars for managed_node2 25052 1726882487.97026: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882487.97037: Calling all_plugins_play to load vars for managed_node2 25052 1726882487.97040: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882487.97043: Calling groups_plugins_play to load vars for managed_node2 25052 1726882487.97706: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000070 25052 1726882487.97710: WORKER PROCESS EXITING 25052 1726882487.99123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.00761: done with get_vars() 25052 1726882488.00789: done getting variables 25052 1726882488.00852: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:34:48 -0400 (0:00:00.057) 0:00:24.963 ****** 25052 1726882488.00888: entering _queue_task() for managed_node2/fail 25052 1726882488.01228: worker is 1 (out of 1 available) 25052 1726882488.01242: exiting _queue_task() for managed_node2/fail 25052 1726882488.01253: done queuing things up, now waiting for results queue to drain 25052 1726882488.01255: waiting for pending results... 
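
Note: the fail task above was skipped because its logged condition, network_state != {}, evaluated to False (network_state still holds the role's empty-dict default), so any further conditions in the task were never reached. A minimal sketch of a fail task guarded that way, built only from the logged task name and condition; the message text, and whatever additional provider check the real task carries, are assumptions rather than the actual role source:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}   # the condition that evaluated False in this run
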
25052 1726882488.01544: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 25052 1726882488.01720: in run() - task 12673a56-9f93-f7f6-4a6d-000000000071 25052 1726882488.01724: variable 'ansible_search_path' from source: unknown 25052 1726882488.01727: variable 'ansible_search_path' from source: unknown 25052 1726882488.01745: calling self._execute() 25052 1726882488.01846: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.01856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.01867: variable 'omit' from source: magic vars 25052 1726882488.02262: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.02265: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.02367: variable 'network_state' from source: role '' defaults 25052 1726882488.02390: Evaluated conditional (network_state != {}): False 25052 1726882488.02481: when evaluation is False, skipping this task 25052 1726882488.02485: _execute() done 25052 1726882488.02487: dumping result to json 25052 1726882488.02489: done dumping result, returning 25052 1726882488.02492: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-f7f6-4a6d-000000000071] 25052 1726882488.02496: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000071 25052 1726882488.02564: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000071 25052 1726882488.02568: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882488.02617: no more pending results, returning what we have 25052 1726882488.02621: results queue empty 25052 1726882488.02622: checking for any_errors_fatal 25052 1726882488.02629: done checking for any_errors_fatal 25052 1726882488.02630: checking for max_fail_percentage 25052 1726882488.02632: done checking for max_fail_percentage 25052 1726882488.02633: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.02634: done checking to see if all hosts have failed 25052 1726882488.02634: getting the remaining hosts for this loop 25052 1726882488.02636: done getting the remaining hosts for this loop 25052 1726882488.02639: getting the next task for host managed_node2 25052 1726882488.02645: done getting next task for host managed_node2 25052 1726882488.02649: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25052 1726882488.02652: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.02672: getting variables 25052 1726882488.02674: in VariableManager get_vars() 25052 1726882488.02917: Calling all_inventory to load vars for managed_node2 25052 1726882488.02920: Calling groups_inventory to load vars for managed_node2 25052 1726882488.02922: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.02931: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.02934: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.02937: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.04338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.05819: done with get_vars() 25052 1726882488.05843: done getting variables 25052 1726882488.05902: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:34:48 -0400 (0:00:00.050) 0:00:25.014 ****** 25052 1726882488.05936: entering _queue_task() for managed_node2/fail 25052 1726882488.06256: worker is 1 (out of 1 available) 25052 1726882488.06269: exiting _queue_task() for managed_node2/fail 25052 1726882488.06279: done queuing things up, now waiting for results queue to drain 25052 1726882488.06280: waiting for pending results... 
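
Note: the same pattern repeats for the task above: the network_state != {} guard evaluated to False, so the version check implied by the task name was never exercised in this run. A hedged sketch combining the logged condition with a version guard assumed from the task name (not the actual role source):

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying the network state configuration requires EL 8 or later  # assumed wording
      when:
        - network_state != {}                            # evaluated False in this run
        - ansible_distribution_major_version | int < 8   # assumed guard, implied by the task name
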
25052 1726882488.06813: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 25052 1726882488.06820: in run() - task 12673a56-9f93-f7f6-4a6d-000000000072 25052 1726882488.06824: variable 'ansible_search_path' from source: unknown 25052 1726882488.06828: variable 'ansible_search_path' from source: unknown 25052 1726882488.06831: calling self._execute() 25052 1726882488.06999: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.07003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.07006: variable 'omit' from source: magic vars 25052 1726882488.07398: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.07402: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.07417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.10087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.10326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.10361: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.10512: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.10537: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.10729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.10758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.10782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.10824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.10840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.11055: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.11070: Evaluated conditional (ansible_distribution_major_version | int > 9): True 25052 1726882488.11456: variable 'ansible_distribution' from source: facts 25052 1726882488.11459: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.11470: Evaluated conditional (ansible_distribution in __network_rh_distros): True 25052 1726882488.11968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.11996: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.12022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.12059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.12072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.12250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.12271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.12298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.12404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.12420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.12575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.12599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.12623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.12774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.12789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.13452: variable 'network_connections' from source: task vars 25052 1726882488.13461: variable 'interface' from source: play vars 25052 1726882488.13602: variable 'interface' from source: play vars 25052 1726882488.13613: variable 'network_state' from source: role '' defaults 25052 1726882488.13898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.14023: 
Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.14058: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.14204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.14231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.14272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.14408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.14434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.14458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.14481: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 25052 1726882488.14485: when evaluation is False, skipping this task 25052 1726882488.14487: _execute() done 25052 1726882488.14490: dumping result to json 25052 1726882488.14496: done dumping result, returning 25052 1726882488.14621: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-f7f6-4a6d-000000000072] 25052 1726882488.14625: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000072 25052 1726882488.14728: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000072 25052 1726882488.14732: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 25052 1726882488.14782: no more pending results, returning what we have 25052 1726882488.14785: results queue empty 25052 1726882488.14786: checking for any_errors_fatal 25052 1726882488.14795: done checking for any_errors_fatal 25052 1726882488.14796: checking for max_fail_percentage 25052 1726882488.14798: done checking for max_fail_percentage 25052 1726882488.14799: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.14800: done checking to see if all hosts have failed 25052 1726882488.14801: getting the remaining hosts for this loop 25052 1726882488.14802: done getting the remaining hosts for this loop 25052 1726882488.14806: getting the next task for host managed_node2 25052 1726882488.14813: done getting next task for host managed_node2 25052 1726882488.14817: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25052 1726882488.14821: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882488.14839: getting variables 25052 1726882488.14841: in VariableManager get_vars() 25052 1726882488.14883: Calling all_inventory to load vars for managed_node2 25052 1726882488.14886: Calling groups_inventory to load vars for managed_node2 25052 1726882488.14889: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.15003: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.15008: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.15011: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.17848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.20341: done with get_vars() 25052 1726882488.20402: done getting variables 25052 1726882488.20461: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:34:48 -0400 (0:00:00.145) 0:00:25.159 ****** 25052 1726882488.20500: entering _queue_task() for managed_node2/dnf 25052 1726882488.21042: worker is 1 (out of 1 available) 25052 1726882488.21054: exiting _queue_task() for managed_node2/dnf 25052 1726882488.21063: done queuing things up, now waiting for results queue to drain 25052 1726882488.21064: waiting for pending results... 
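
Note: the teaming abort task above logged its full conditional before skipping: the host is EL10 or later (ansible_distribution_major_version | int > 9 was True) and a Red Hat family distribution (ansible_distribution in __network_rh_distros was True), but neither network_connections nor network_state.interfaces contains an entry whose type matches ^team$, so both selectattr chains produced empty lists and the or-expression evaluated to False. A hedged sketch of such a task, reusing the conditions exactly as logged; only the message text is assumed:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
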
25052 1726882488.21306: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 25052 1726882488.21453: in run() - task 12673a56-9f93-f7f6-4a6d-000000000073 25052 1726882488.21458: variable 'ansible_search_path' from source: unknown 25052 1726882488.21460: variable 'ansible_search_path' from source: unknown 25052 1726882488.21505: calling self._execute() 25052 1726882488.21617: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.21632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.21649: variable 'omit' from source: magic vars 25052 1726882488.22038: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.22055: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.22328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.24913: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.25008: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.25053: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.25097: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.25132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.25262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.25266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.25292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.25342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.25362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.25504: variable 'ansible_distribution' from source: facts 25052 1726882488.25514: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.25590: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 25052 1726882488.25661: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.25800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.25836: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.25864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.25915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.25941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.25983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.26012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.26103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.26136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.26214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.26514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.26517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.26519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.26550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.26566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.26847: variable 'network_connections' from source: task vars 25052 1726882488.26924: variable 'interface' from source: play vars 25052 1726882488.26997: variable 'interface' from source: play vars 25052 1726882488.27081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.27323: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.27561: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.27564: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.27566: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.27687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.27722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.27760: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.28002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.28006: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882488.28417: variable 'network_connections' from source: task vars 25052 1726882488.28431: variable 'interface' from source: play vars 25052 1726882488.28586: variable 'interface' from source: play vars 25052 1726882488.28643: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882488.28688: when evaluation is False, skipping this task 25052 1726882488.28708: _execute() done 25052 1726882488.28716: dumping result to json 25052 1726882488.28744: done dumping result, returning 25052 1726882488.28764: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-000000000073] 25052 1726882488.28819: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000073 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882488.29143: no more pending results, returning what we have 25052 1726882488.29147: results queue empty 25052 1726882488.29148: checking for any_errors_fatal 25052 1726882488.29156: done checking for any_errors_fatal 25052 1726882488.29157: checking for max_fail_percentage 25052 1726882488.29159: done checking for max_fail_percentage 25052 1726882488.29160: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.29161: done checking to see if all hosts have failed 25052 1726882488.29162: getting the remaining hosts for this loop 25052 1726882488.29163: done getting the remaining hosts for this loop 25052 1726882488.29167: getting the next task for host managed_node2 25052 1726882488.29174: done getting next task for host managed_node2 25052 1726882488.29178: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25052 1726882488.29182: ^ state is: HOST 
STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882488.29206: getting variables 25052 1726882488.29209: in VariableManager get_vars() 25052 1726882488.29252: Calling all_inventory to load vars for managed_node2 25052 1726882488.29256: Calling groups_inventory to load vars for managed_node2 25052 1726882488.29258: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.29269: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.29272: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.29275: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.29806: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000073 25052 1726882488.29809: WORKER PROCESS EXITING 25052 1726882488.31512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.33716: done with get_vars() 25052 1726882488.33783: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 25052 1726882488.33867: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:34:48 -0400 (0:00:00.134) 0:00:25.293 ****** 25052 1726882488.33906: entering _queue_task() for managed_node2/yum 25052 1726882488.34159: worker is 1 (out of 1 available) 25052 1726882488.34172: exiting _queue_task() for managed_node2/yum 25052 1726882488.34184: done queuing things up, now waiting for results queue to drain 25052 1726882488.34185: waiting for pending results... 
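The DNF-based update check above was skipped because its guard (__network_wireless_connections_defined or __network_team_connections_defined) evaluated to False, and the YUM-based twin queued here at roles/network/tasks/main.yml:48 is about to be skipped as well, since its extra guard requires a distribution major version below 8 while this host redirects yum to dnf. Judging only from the task names, the loaded action plugins, and the false_condition strings recorded in this log, the pair of tasks presumably resembles the sketch below; the module arguments (state, update_only, check_mode) are illustrative assumptions, not the role's verbatim source:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
        update_only: true
      check_mode: true
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
        update_only: true
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined

Splitting the check into a DNF and a YUM variant keyed on the distribution version is what produces the two near-identical task headers seen in this part of the log.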
25052 1726882488.34372: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 25052 1726882488.34482: in run() - task 12673a56-9f93-f7f6-4a6d-000000000074 25052 1726882488.34486: variable 'ansible_search_path' from source: unknown 25052 1726882488.34488: variable 'ansible_search_path' from source: unknown 25052 1726882488.34523: calling self._execute() 25052 1726882488.34590: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.34598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.34606: variable 'omit' from source: magic vars 25052 1726882488.34875: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.34884: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.35002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.37229: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.37234: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.37256: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.37295: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.37322: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.37381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.37403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.37421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.37450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.37461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.37529: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.37541: Evaluated conditional (ansible_distribution_major_version | int < 8): False 25052 1726882488.37544: when evaluation is False, skipping this task 25052 1726882488.37548: _execute() done 25052 1726882488.37551: dumping result to json 25052 1726882488.37553: done dumping result, returning 25052 1726882488.37561: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-000000000074] 25052 
1726882488.37564: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000074 25052 1726882488.37649: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000074 25052 1726882488.37651: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 25052 1726882488.37714: no more pending results, returning what we have 25052 1726882488.37718: results queue empty 25052 1726882488.37718: checking for any_errors_fatal 25052 1726882488.37726: done checking for any_errors_fatal 25052 1726882488.37727: checking for max_fail_percentage 25052 1726882488.37728: done checking for max_fail_percentage 25052 1726882488.37729: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.37730: done checking to see if all hosts have failed 25052 1726882488.37731: getting the remaining hosts for this loop 25052 1726882488.37732: done getting the remaining hosts for this loop 25052 1726882488.37735: getting the next task for host managed_node2 25052 1726882488.37742: done getting next task for host managed_node2 25052 1726882488.37745: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25052 1726882488.37748: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.37766: getting variables 25052 1726882488.37767: in VariableManager get_vars() 25052 1726882488.37808: Calling all_inventory to load vars for managed_node2 25052 1726882488.37811: Calling groups_inventory to load vars for managed_node2 25052 1726882488.37813: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.37821: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.37824: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.37826: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.38750: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.39607: done with get_vars() 25052 1726882488.39623: done getting variables 25052 1726882488.39666: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:34:48 -0400 (0:00:00.057) 0:00:25.351 ****** 25052 1726882488.39688: entering _queue_task() for managed_node2/fail 25052 1726882488.39918: worker is 1 (out of 1 available) 25052 1726882488.39933: exiting _queue_task() for managed_node2/fail 25052 1726882488.39942: done queuing things up, now waiting for results queue to drain 25052 1726882488.39944: waiting for pending results... 
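The consent task queued here at roles/network/tasks/main.yml:60 uses the fail action and carries the same wireless/team guard that has been skipping the surrounding tasks, so on this run it will be skipped for the same reason. A minimal sketch of such a guard task follows; only the when expression is taken from the false_condition in this log, while the message text and the name of any opt-in variable (network_allow_restart is a guess) are assumptions:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Wireless or team connections were requested, which requires restarting
          NetworkManager; set an opt-in variable (for example network_allow_restart)
          to permit the restart.
      when: __network_wireless_connections_defined or __network_team_connections_defined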
25052 1726882488.40127: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 25052 1726882488.40214: in run() - task 12673a56-9f93-f7f6-4a6d-000000000075 25052 1726882488.40226: variable 'ansible_search_path' from source: unknown 25052 1726882488.40230: variable 'ansible_search_path' from source: unknown 25052 1726882488.40257: calling self._execute() 25052 1726882488.40336: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.40340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.40349: variable 'omit' from source: magic vars 25052 1726882488.40617: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.40626: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.40708: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.40841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.42308: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.42358: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.42384: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.42414: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.42435: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.42492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.42515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.42532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.42557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.42570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.42606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.42622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.42638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.42662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.42678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.42707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.42722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.42738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.42761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.42772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.42881: variable 'network_connections' from source: task vars 25052 1726882488.42896: variable 'interface' from source: play vars 25052 1726882488.42940: variable 'interface' from source: play vars 25052 1726882488.42987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.43094: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.43132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.43154: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.43399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.43402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.43405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.43407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.43409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.43411: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882488.43542: variable 'network_connections' from 
source: task vars 25052 1726882488.43546: variable 'interface' from source: play vars 25052 1726882488.43604: variable 'interface' from source: play vars 25052 1726882488.43625: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882488.43634: when evaluation is False, skipping this task 25052 1726882488.43640: _execute() done 25052 1726882488.43643: dumping result to json 25052 1726882488.43645: done dumping result, returning 25052 1726882488.43647: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-000000000075] 25052 1726882488.43649: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000075 25052 1726882488.43730: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000075 25052 1726882488.43733: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882488.43790: no more pending results, returning what we have 25052 1726882488.43796: results queue empty 25052 1726882488.43796: checking for any_errors_fatal 25052 1726882488.43802: done checking for any_errors_fatal 25052 1726882488.43803: checking for max_fail_percentage 25052 1726882488.43805: done checking for max_fail_percentage 25052 1726882488.43805: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.43806: done checking to see if all hosts have failed 25052 1726882488.43807: getting the remaining hosts for this loop 25052 1726882488.43808: done getting the remaining hosts for this loop 25052 1726882488.43811: getting the next task for host managed_node2 25052 1726882488.43817: done getting next task for host managed_node2 25052 1726882488.43821: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 25052 1726882488.43824: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.43842: getting variables 25052 1726882488.43843: in VariableManager get_vars() 25052 1726882488.43881: Calling all_inventory to load vars for managed_node2 25052 1726882488.43884: Calling groups_inventory to load vars for managed_node2 25052 1726882488.43887: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.43897: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.43900: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.43902: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.45062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.46160: done with get_vars() 25052 1726882488.46175: done getting variables 25052 1726882488.46219: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:34:48 -0400 (0:00:00.065) 0:00:25.417 ****** 25052 1726882488.46243: entering _queue_task() for managed_node2/package 25052 1726882488.46467: worker is 1 (out of 1 available) 25052 1726882488.46481: exiting _queue_task() for managed_node2/package 25052 1726882488.46494: done queuing things up, now waiting for results queue to drain 25052 1726882488.46496: waiting for pending results... 25052 1726882488.46668: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 25052 1726882488.46757: in run() - task 12673a56-9f93-f7f6-4a6d-000000000076 25052 1726882488.46767: variable 'ansible_search_path' from source: unknown 25052 1726882488.46770: variable 'ansible_search_path' from source: unknown 25052 1726882488.46800: calling self._execute() 25052 1726882488.46873: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.46876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.46885: variable 'omit' from source: magic vars 25052 1726882488.47141: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.47149: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.47280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.47465: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.47499: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.47524: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.47572: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.47650: variable 'network_packages' from source: role '' defaults 25052 1726882488.47722: variable '__network_provider_setup' from source: role '' defaults 25052 1726882488.47731: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882488.47775: variable 
'__network_service_name_default_nm' from source: role '' defaults 25052 1726882488.47783: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882488.47831: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882488.47940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.49443: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.49484: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.49512: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.49534: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.49555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.49611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.49631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.49649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.49676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.49687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.49721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.49737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.49753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.49781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.49790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.49924: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25052 1726882488.50002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.50027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.50045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.50072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.50082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.50146: variable 'ansible_python' from source: facts 25052 1726882488.50167: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25052 1726882488.50224: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882488.50278: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882488.50361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.50377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.50397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.50421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.50436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.50464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.50483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.50501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.50525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.50535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.50627: variable 'network_connections' from source: task vars 25052 1726882488.50633: variable 'interface' from source: play vars 25052 1726882488.50703: variable 'interface' from source: play vars 25052 1726882488.50749: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.50770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.50790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.50814: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.50849: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.51019: variable 'network_connections' from source: task vars 25052 1726882488.51022: variable 'interface' from source: play vars 25052 1726882488.51095: variable 'interface' from source: play vars 25052 1726882488.51117: variable '__network_packages_default_wireless' from source: role '' defaults 25052 1726882488.51169: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.51361: variable 'network_connections' from source: task vars 25052 1726882488.51364: variable 'interface' from source: play vars 25052 1726882488.51412: variable 'interface' from source: play vars 25052 1726882488.51429: variable '__network_packages_default_team' from source: role '' defaults 25052 1726882488.51480: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882488.51670: variable 'network_connections' from source: task vars 25052 1726882488.51673: variable 'interface' from source: play vars 25052 1726882488.51720: variable 'interface' from source: play vars 25052 1726882488.51757: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882488.51799: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882488.51806: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882488.51849: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882488.52000: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25052 1726882488.52281: variable 'network_connections' from source: task vars 25052 1726882488.52284: variable 'interface' from source: play vars 25052 1726882488.52330: variable 'interface' from source: play vars 25052 1726882488.52336: variable 'ansible_distribution' from source: facts 25052 1726882488.52339: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.52345: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.52356: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25052 1726882488.52462: variable 'ansible_distribution' from source: facts 25052 
1726882488.52466: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.52469: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.52480: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25052 1726882488.52584: variable 'ansible_distribution' from source: facts 25052 1726882488.52587: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.52596: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.52621: variable 'network_provider' from source: set_fact 25052 1726882488.52632: variable 'ansible_facts' from source: unknown 25052 1726882488.52998: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 25052 1726882488.53002: when evaluation is False, skipping this task 25052 1726882488.53005: _execute() done 25052 1726882488.53007: dumping result to json 25052 1726882488.53009: done dumping result, returning 25052 1726882488.53018: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-f7f6-4a6d-000000000076] 25052 1726882488.53021: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000076 25052 1726882488.53116: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000076 25052 1726882488.53119: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 25052 1726882488.53165: no more pending results, returning what we have 25052 1726882488.53169: results queue empty 25052 1726882488.53170: checking for any_errors_fatal 25052 1726882488.53175: done checking for any_errors_fatal 25052 1726882488.53176: checking for max_fail_percentage 25052 1726882488.53178: done checking for max_fail_percentage 25052 1726882488.53179: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.53179: done checking to see if all hosts have failed 25052 1726882488.53180: getting the remaining hosts for this loop 25052 1726882488.53182: done getting the remaining hosts for this loop 25052 1726882488.53184: getting the next task for host managed_node2 25052 1726882488.53195: done getting next task for host managed_node2 25052 1726882488.53199: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25052 1726882488.53202: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.53219: getting variables 25052 1726882488.53220: in VariableManager get_vars() 25052 1726882488.53258: Calling all_inventory to load vars for managed_node2 25052 1726882488.53261: Calling groups_inventory to load vars for managed_node2 25052 1726882488.53263: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.53272: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.53275: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.53278: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.54635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.55498: done with get_vars() 25052 1726882488.55513: done getting variables 25052 1726882488.55555: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:34:48 -0400 (0:00:00.093) 0:00:25.510 ****** 25052 1726882488.55578: entering _queue_task() for managed_node2/package 25052 1726882488.55817: worker is 1 (out of 1 available) 25052 1726882488.55830: exiting _queue_task() for managed_node2/package 25052 1726882488.55842: done queuing things up, now waiting for results queue to drain 25052 1726882488.55843: waiting for pending results... 
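The Install packages task at roles/network/tasks/main.yml:73 was skipped above because every entry in network_packages is already present in the gathered package facts, so the guard not network_packages is subset(ansible_facts.packages.keys()) came out False. A sketch of that pattern, with the state argument assumed:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Checking the desired package list with the subset test against ansible_facts.packages (presumably gathered earlier with package_facts) lets the role skip the package manager entirely when nothing needs installing, which is faster than letting the package module report "ok" on every run.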
25052 1726882488.56028: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 25052 1726882488.56126: in run() - task 12673a56-9f93-f7f6-4a6d-000000000077 25052 1726882488.56137: variable 'ansible_search_path' from source: unknown 25052 1726882488.56141: variable 'ansible_search_path' from source: unknown 25052 1726882488.56168: calling self._execute() 25052 1726882488.56248: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.56252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.56260: variable 'omit' from source: magic vars 25052 1726882488.56999: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.57003: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.57005: variable 'network_state' from source: role '' defaults 25052 1726882488.57007: Evaluated conditional (network_state != {}): False 25052 1726882488.57009: when evaluation is False, skipping this task 25052 1726882488.57011: _execute() done 25052 1726882488.57013: dumping result to json 25052 1726882488.57015: done dumping result, returning 25052 1726882488.57016: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-f7f6-4a6d-000000000077] 25052 1726882488.57019: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000077 25052 1726882488.57081: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000077 25052 1726882488.57083: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882488.57121: no more pending results, returning what we have 25052 1726882488.57124: results queue empty 25052 1726882488.57125: checking for any_errors_fatal 25052 1726882488.57130: done checking for any_errors_fatal 25052 1726882488.57130: checking for max_fail_percentage 25052 1726882488.57132: done checking for max_fail_percentage 25052 1726882488.57132: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.57133: done checking to see if all hosts have failed 25052 1726882488.57134: getting the remaining hosts for this loop 25052 1726882488.57135: done getting the remaining hosts for this loop 25052 1726882488.57137: getting the next task for host managed_node2 25052 1726882488.57142: done getting next task for host managed_node2 25052 1726882488.57146: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25052 1726882488.57149: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.57164: getting variables 25052 1726882488.57165: in VariableManager get_vars() 25052 1726882488.57198: Calling all_inventory to load vars for managed_node2 25052 1726882488.57201: Calling groups_inventory to load vars for managed_node2 25052 1726882488.57203: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.57211: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.57214: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.57217: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.58461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.60014: done with get_vars() 25052 1726882488.60040: done getting variables 25052 1726882488.60101: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:34:48 -0400 (0:00:00.045) 0:00:25.556 ****** 25052 1726882488.60137: entering _queue_task() for managed_node2/package 25052 1726882488.60487: worker is 1 (out of 1 available) 25052 1726882488.60702: exiting _queue_task() for managed_node2/package 25052 1726882488.60713: done queuing things up, now waiting for results queue to drain 25052 1726882488.60715: waiting for pending results... 
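Both nmstate-related install tasks (roles/network/tasks/main.yml:85 and :96) are gated on network_state != {}, and since this play drives the role through network_connections while network_state keeps its empty role default, both are skipped. A sketch of the pair, with the package states assumed:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager
          - nmstate
        state: present
      when: network_state != {}

    - name: Install python3-libnmstate when using network_state variable
      ansible.builtin.package:
        name: python3-libnmstate
        state: present
      when: network_state != {}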
25052 1726882488.60912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 25052 1726882488.61048: in run() - task 12673a56-9f93-f7f6-4a6d-000000000078 25052 1726882488.61052: variable 'ansible_search_path' from source: unknown 25052 1726882488.61054: variable 'ansible_search_path' from source: unknown 25052 1726882488.61056: calling self._execute() 25052 1726882488.61091: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.61104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.61119: variable 'omit' from source: magic vars 25052 1726882488.61487: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.61505: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.61628: variable 'network_state' from source: role '' defaults 25052 1726882488.61644: Evaluated conditional (network_state != {}): False 25052 1726882488.61652: when evaluation is False, skipping this task 25052 1726882488.61658: _execute() done 25052 1726882488.61664: dumping result to json 25052 1726882488.61669: done dumping result, returning 25052 1726882488.61679: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-f7f6-4a6d-000000000078] 25052 1726882488.61686: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000078 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882488.61846: no more pending results, returning what we have 25052 1726882488.61850: results queue empty 25052 1726882488.61851: checking for any_errors_fatal 25052 1726882488.61858: done checking for any_errors_fatal 25052 1726882488.61858: checking for max_fail_percentage 25052 1726882488.61860: done checking for max_fail_percentage 25052 1726882488.61861: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.61862: done checking to see if all hosts have failed 25052 1726882488.61862: getting the remaining hosts for this loop 25052 1726882488.61864: done getting the remaining hosts for this loop 25052 1726882488.61868: getting the next task for host managed_node2 25052 1726882488.61874: done getting next task for host managed_node2 25052 1726882488.61878: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25052 1726882488.61882: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.61902: getting variables 25052 1726882488.61904: in VariableManager get_vars() 25052 1726882488.61944: Calling all_inventory to load vars for managed_node2 25052 1726882488.61946: Calling groups_inventory to load vars for managed_node2 25052 1726882488.61949: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.61961: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.61964: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.61968: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.62843: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000078 25052 1726882488.62846: WORKER PROCESS EXITING 25052 1726882488.63700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.65172: done with get_vars() 25052 1726882488.65196: done getting variables 25052 1726882488.65250: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:34:48 -0400 (0:00:00.051) 0:00:25.607 ****** 25052 1726882488.65281: entering _queue_task() for managed_node2/service 25052 1726882488.65807: worker is 1 (out of 1 available) 25052 1726882488.65816: exiting _queue_task() for managed_node2/service 25052 1726882488.65825: done queuing things up, now waiting for results queue to drain 25052 1726882488.65826: waiting for pending results... 
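The restart task queued here (roles/network/tasks/main.yml:109) uses the service action with the same wireless/team guard, so it is about to be skipped like its predecessors. The log earlier resolved __network_service_name_default_nm from role defaults, so the real task presumably templates the service name from that variable; in the sketch below the literal NetworkManager name and the restarted state are assumptions:

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined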
25052 1726882488.65912: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 25052 1726882488.66070: in run() - task 12673a56-9f93-f7f6-4a6d-000000000079 25052 1726882488.66088: variable 'ansible_search_path' from source: unknown 25052 1726882488.66098: variable 'ansible_search_path' from source: unknown 25052 1726882488.66137: calling self._execute() 25052 1726882488.66241: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.66251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.66275: variable 'omit' from source: magic vars 25052 1726882488.66681: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.66704: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.67081: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.67254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.69613: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.69875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.69921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.69989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.70025: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.70114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.70151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.70235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.70379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.70383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.70385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.70412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.70444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 25052 1726882488.70496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.70516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.70561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.70592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.70798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.70801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.70803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.71157: variable 'network_connections' from source: task vars 25052 1726882488.71175: variable 'interface' from source: play vars 25052 1726882488.71265: variable 'interface' from source: play vars 25052 1726882488.71342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.71583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.71801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.71804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.71814: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.71949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.71975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.72006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.72130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.72184: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882488.72728: variable 'network_connections' from source: task vars 25052 1726882488.72738: variable 'interface' from source: 
play vars 25052 1726882488.72799: variable 'interface' from source: play vars 25052 1726882488.72930: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 25052 1726882488.72937: when evaluation is False, skipping this task 25052 1726882488.72943: _execute() done 25052 1726882488.72949: dumping result to json 25052 1726882488.72955: done dumping result, returning 25052 1726882488.72965: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f7f6-4a6d-000000000079] 25052 1726882488.72972: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000079 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 25052 1726882488.73186: no more pending results, returning what we have 25052 1726882488.73190: results queue empty 25052 1726882488.73191: checking for any_errors_fatal 25052 1726882488.73199: done checking for any_errors_fatal 25052 1726882488.73200: checking for max_fail_percentage 25052 1726882488.73202: done checking for max_fail_percentage 25052 1726882488.73203: checking to see if all hosts have failed and the running result is not ok 25052 1726882488.73204: done checking to see if all hosts have failed 25052 1726882488.73205: getting the remaining hosts for this loop 25052 1726882488.73206: done getting the remaining hosts for this loop 25052 1726882488.73210: getting the next task for host managed_node2 25052 1726882488.73219: done getting next task for host managed_node2 25052 1726882488.73223: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25052 1726882488.73226: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882488.73245: getting variables 25052 1726882488.73247: in VariableManager get_vars() 25052 1726882488.73289: Calling all_inventory to load vars for managed_node2 25052 1726882488.73291: Calling groups_inventory to load vars for managed_node2 25052 1726882488.73699: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882488.73710: Calling all_plugins_play to load vars for managed_node2 25052 1726882488.73714: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882488.73717: Calling groups_plugins_play to load vars for managed_node2 25052 1726882488.74581: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000079 25052 1726882488.74585: WORKER PROCESS EXITING 25052 1726882488.78135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882488.80436: done with get_vars() 25052 1726882488.80460: done getting variables 25052 1726882488.80529: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:34:48 -0400 (0:00:00.152) 0:00:25.760 ****** 25052 1726882488.80567: entering _queue_task() for managed_node2/service 25052 1726882488.80943: worker is 1 (out of 1 available) 25052 1726882488.80957: exiting _queue_task() for managed_node2/service 25052 1726882488.80969: done queuing things up, now waiting for results queue to drain 25052 1726882488.80971: waiting for pending results... 
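[editorial note] The entries above show the role's conditional restart task being skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined was true, after which the next task ("Enable and start NetworkManager", tasks/main.yml:122) is queued. As an illustrative sketch only, and not the actual fedora.linux_system_roles.network source, a conditional restart task that would produce this kind of skip result could look like:

    # Illustrative sketch, not the role's own YAML. The `when` expression mirrors
    # the false_condition reported in the skip result logged above; the module and
    # service name are assumptions for the example.
    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: NetworkManager
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

With both flags false, Ansible records "skip_reason": "Conditional result was False" and "changed": false, as seen in the skipping: [managed_node2] result above.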
25052 1726882488.81278: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 25052 1726882488.81403: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007a 25052 1726882488.81417: variable 'ansible_search_path' from source: unknown 25052 1726882488.81421: variable 'ansible_search_path' from source: unknown 25052 1726882488.81463: calling self._execute() 25052 1726882488.81565: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.81569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.81600: variable 'omit' from source: magic vars 25052 1726882488.81961: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.81980: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882488.82154: variable 'network_provider' from source: set_fact 25052 1726882488.82157: variable 'network_state' from source: role '' defaults 25052 1726882488.82168: Evaluated conditional (network_provider == "nm" or network_state != {}): True 25052 1726882488.82174: variable 'omit' from source: magic vars 25052 1726882488.82238: variable 'omit' from source: magic vars 25052 1726882488.82267: variable 'network_service_name' from source: role '' defaults 25052 1726882488.82341: variable 'network_service_name' from source: role '' defaults 25052 1726882488.82600: variable '__network_provider_setup' from source: role '' defaults 25052 1726882488.82603: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882488.82606: variable '__network_service_name_default_nm' from source: role '' defaults 25052 1726882488.82608: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882488.82610: variable '__network_packages_default_nm' from source: role '' defaults 25052 1726882488.82901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882488.85008: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882488.85087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882488.85126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882488.85165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882488.85191: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882488.85277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.85311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.85335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.85381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 25052 1726882488.85399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.85443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.85474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.85500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.85541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.85555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.85810: variable '__network_packages_default_gobject_packages' from source: role '' defaults 25052 1726882488.86099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.86102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.86105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.86107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.86109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.86167: variable 'ansible_python' from source: facts 25052 1726882488.86243: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 25052 1726882488.86263: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882488.86349: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882488.86480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.86507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.86531: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.86575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.86589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.86637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882488.86666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882488.86691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.86732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882488.86746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882488.87225: variable 'network_connections' from source: task vars 25052 1726882488.87228: variable 'interface' from source: play vars 25052 1726882488.87230: variable 'interface' from source: play vars 25052 1726882488.87232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882488.87307: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882488.87364: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882488.87440: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882488.87510: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882488.87569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882488.87595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882488.87630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882488.87657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 1726882488.87917: variable '__network_wireless_connections_defined' from source: role 
'' defaults 25052 1726882488.88424: variable 'network_connections' from source: task vars 25052 1726882488.88431: variable 'interface' from source: play vars 25052 1726882488.88714: variable 'interface' from source: play vars 25052 1726882488.88748: variable '__network_packages_default_wireless' from source: role '' defaults 25052 1726882488.88916: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882488.89526: variable 'network_connections' from source: task vars 25052 1726882488.89532: variable 'interface' from source: play vars 25052 1726882488.89719: variable 'interface' from source: play vars 25052 1726882488.89743: variable '__network_packages_default_team' from source: role '' defaults 25052 1726882488.89936: variable '__network_team_connections_defined' from source: role '' defaults 25052 1726882488.90541: variable 'network_connections' from source: task vars 25052 1726882488.90544: variable 'interface' from source: play vars 25052 1726882488.90733: variable 'interface' from source: play vars 25052 1726882488.90904: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882488.90962: variable '__network_service_name_default_initscripts' from source: role '' defaults 25052 1726882488.90969: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882488.91143: variable '__network_packages_default_initscripts' from source: role '' defaults 25052 1726882488.91601: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 25052 1726882488.92776: variable 'network_connections' from source: task vars 25052 1726882488.92780: variable 'interface' from source: play vars 25052 1726882488.92878: variable 'interface' from source: play vars 25052 1726882488.92886: variable 'ansible_distribution' from source: facts 25052 1726882488.92888: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.92903: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.92917: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 25052 1726882488.93242: variable 'ansible_distribution' from source: facts 25052 1726882488.93245: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.93250: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.93264: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 25052 1726882488.93664: variable 'ansible_distribution' from source: facts 25052 1726882488.93667: variable '__network_rh_distros' from source: role '' defaults 25052 1726882488.93673: variable 'ansible_distribution_major_version' from source: facts 25052 1726882488.93998: variable 'network_provider' from source: set_fact 25052 1726882488.94001: variable 'omit' from source: magic vars 25052 1726882488.94003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882488.94006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882488.94008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882488.94010: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882488.94012: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882488.94014: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882488.94016: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.94017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.94198: Set connection var ansible_pipelining to False 25052 1726882488.94201: Set connection var ansible_connection to ssh 25052 1726882488.94203: Set connection var ansible_shell_type to sh 25052 1726882488.94205: Set connection var ansible_timeout to 10 25052 1726882488.94207: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882488.94209: Set connection var ansible_shell_executable to /bin/sh 25052 1726882488.94211: variable 'ansible_shell_executable' from source: unknown 25052 1726882488.94213: variable 'ansible_connection' from source: unknown 25052 1726882488.94215: variable 'ansible_module_compression' from source: unknown 25052 1726882488.94217: variable 'ansible_shell_type' from source: unknown 25052 1726882488.94219: variable 'ansible_shell_executable' from source: unknown 25052 1726882488.94221: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882488.94223: variable 'ansible_pipelining' from source: unknown 25052 1726882488.94225: variable 'ansible_timeout' from source: unknown 25052 1726882488.94227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882488.94313: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882488.94322: variable 'omit' from source: magic vars 25052 1726882488.94330: starting attempt loop 25052 1726882488.94332: running the handler 25052 1726882488.94699: variable 'ansible_facts' from source: unknown 25052 1726882488.95292: _low_level_execute_command(): starting 25052 1726882488.95305: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882488.96216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882488.96259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882488.96328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882488.96359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 25052 1726882488.96827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882488.98539: stdout chunk (state=3): >>>/root <<< 25052 1726882488.98745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882488.98998: stdout chunk (state=3): >>><<< 25052 1726882488.99001: stderr chunk (state=3): >>><<< 25052 1726882488.99004: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882488.99006: _low_level_execute_command(): starting 25052 1726882488.99009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289 `" && echo ansible-tmp-1726882488.9878018-26270-56892767888289="` echo /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289 `" ) && sleep 0' 25052 1726882489.00008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882489.00086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882489.00116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882489.00169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882489.00172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882489.00242: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882489.00343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882489.00355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 
1726882489.00366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882489.00691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882489.02595: stdout chunk (state=3): >>>ansible-tmp-1726882488.9878018-26270-56892767888289=/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289 <<< 25052 1726882489.02745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882489.02748: stdout chunk (state=3): >>><<< 25052 1726882489.02898: stderr chunk (state=3): >>><<< 25052 1726882489.02902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882488.9878018-26270-56892767888289=/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882489.02904: variable 'ansible_module_compression' from source: unknown 25052 1726882489.02953: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 25052 1726882489.03138: variable 'ansible_facts' from source: unknown 25052 1726882489.03473: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py 25052 1726882489.03880: Sending initial data 25052 1726882489.03884: Sent initial data (155 bytes) 25052 1726882489.04803: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882489.04807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 
1726882489.05138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882489.05141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882489.05178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882489.05313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882489.07032: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25052 1726882489.07037: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882489.07099: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882489.07161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp58s0il_0 /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py <<< 25052 1726882489.07165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py" <<< 25052 1726882489.07502: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp58s0il_0" to remote "/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py" <<< 25052 1726882489.10907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882489.11087: stderr chunk (state=3): >>><<< 25052 1726882489.11090: stdout chunk (state=3): >>><<< 25052 1726882489.11112: done transferring module to remote 25052 1726882489.11124: _low_level_execute_command(): starting 25052 1726882489.11135: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/ /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py && sleep 0' 25052 1726882489.12267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882489.12322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882489.12340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882489.12353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882489.12700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882489.12704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882489.12868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882489.12997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882489.14735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882489.14739: stdout chunk (state=3): >>><<< 25052 1726882489.14746: stderr chunk (state=3): >>><<< 25052 1726882489.14957: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882489.14961: _low_level_execute_command(): starting 25052 1726882489.14964: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/AnsiballZ_systemd.py && sleep 0' 25052 1726882489.16599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882489.16602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882489.16898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882489.16902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882489.16906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882489.16921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882489.17020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882489.45719: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314741248", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1168574000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", 
"StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", 
"ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", 
"SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 25052 1726882489.47466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882489.47476: stdout chunk (state=3): >>><<< 25052 1726882489.47486: stderr chunk (state=3): >>><<< 25052 1726882489.47518: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4583424", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314741248", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1168574000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not 
set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": 
"0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": 
"10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882489.47885: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882489.48019: _low_level_execute_command(): starting 25052 1726882489.48029: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882488.9878018-26270-56892767888289/ > /dev/null 2>&1 && sleep 0' 25052 1726882489.49166: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882489.49397: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882489.49910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882489.50035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882489.51899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882489.51903: stdout chunk (state=3): >>><<< 25052 1726882489.51905: stderr chunk (state=3): >>><<< 25052 1726882489.51920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882489.51932: handler run complete 25052 1726882489.52006: attempt loop complete, returning result 25052 1726882489.52301: _execute() done 25052 1726882489.52304: dumping result to json 25052 1726882489.52306: done dumping result, returning 25052 1726882489.52308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-f7f6-4a6d-00000000007a] 25052 1726882489.52310: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007a ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882489.52722: no more pending results, returning what we have 25052 1726882489.52727: results queue empty 25052 1726882489.52728: checking for any_errors_fatal 25052 1726882489.52736: done checking for any_errors_fatal 25052 1726882489.52737: checking for max_fail_percentage 25052 1726882489.52739: done checking for max_fail_percentage 25052 1726882489.52740: checking to see if all hosts have failed and the running result is not ok 25052 1726882489.52741: done checking to see if all hosts have failed 25052 1726882489.52741: getting the remaining hosts for this loop 25052 1726882489.52743: done getting the remaining hosts for this loop 25052 1726882489.52746: getting the next task for host managed_node2 25052 1726882489.52752: done getting next task for host managed_node2 25052 1726882489.52756: ^ task is: TASK: fedora.linux_system_roles.network : Enable 
and start wpa_supplicant 25052 1726882489.52759: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882489.52776: getting variables 25052 1726882489.52778: in VariableManager get_vars() 25052 1726882489.52820: Calling all_inventory to load vars for managed_node2 25052 1726882489.52823: Calling groups_inventory to load vars for managed_node2 25052 1726882489.52825: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882489.52835: Calling all_plugins_play to load vars for managed_node2 25052 1726882489.52839: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882489.52842: Calling groups_plugins_play to load vars for managed_node2 25052 1726882489.53703: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007a 25052 1726882489.53707: WORKER PROCESS EXITING 25052 1726882489.56087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882489.59496: done with get_vars() 25052 1726882489.59530: done getting variables 25052 1726882489.59590: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:34:49 -0400 (0:00:00.791) 0:00:26.552 ****** 25052 1726882489.59757: entering _queue_task() for managed_node2/service 25052 1726882489.60461: worker is 1 (out of 1 available) 25052 1726882489.60473: exiting _queue_task() for managed_node2/service 25052 1726882489.60601: done queuing things up, now waiting for results queue to drain 25052 1726882489.60603: waiting for pending results... 
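For reference, the ansible.legacy.systemd invocation logged above (name=NetworkManager, state=started, enabled=true, daemon_reload=false, scope=system, with no_log in effect, so the result is shown as censored) corresponds to a service-management task of roughly the following shape. This is a minimal sketch reconstructed from the logged module arguments, not the actual source of the fedora.linux_system_roles.network role:

- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
    daemon_reload: false
    scope: system
  no_log: true  # matches "_ansible_no_log: True" and the censored result in the log above
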
25052 1726882489.61148: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 25052 1726882489.61267: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007b 25052 1726882489.61284: variable 'ansible_search_path' from source: unknown 25052 1726882489.61288: variable 'ansible_search_path' from source: unknown 25052 1726882489.61789: calling self._execute() 25052 1726882489.62020: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882489.62028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882489.62032: variable 'omit' from source: magic vars 25052 1726882489.62957: variable 'ansible_distribution_major_version' from source: facts 25052 1726882489.62961: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882489.63206: variable 'network_provider' from source: set_fact 25052 1726882489.63210: Evaluated conditional (network_provider == "nm"): True 25052 1726882489.63325: variable '__network_wpa_supplicant_required' from source: role '' defaults 25052 1726882489.63416: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 25052 1726882489.63783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882489.69040: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882489.69314: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882489.69350: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882489.69384: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882489.69414: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882489.69486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882489.69720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882489.69744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882489.69781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882489.69801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882489.69843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882489.69864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 25052 1726882489.69886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882489.70201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882489.70205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882489.70207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882489.70210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882489.70229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882489.70310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882489.70320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882489.70617: variable 'network_connections' from source: task vars 25052 1726882489.70631: variable 'interface' from source: play vars 25052 1726882489.70975: variable 'interface' from source: play vars 25052 1726882489.70978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882489.71256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882489.71301: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882489.71431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882489.71626: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882489.71689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 25052 1726882489.71720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 25052 1726882489.71747: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882489.71773: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 25052 
1726882489.72025: variable '__network_wireless_connections_defined' from source: role '' defaults 25052 1726882489.72473: variable 'network_connections' from source: task vars 25052 1726882489.72476: variable 'interface' from source: play vars 25052 1726882489.72543: variable 'interface' from source: play vars 25052 1726882489.72572: Evaluated conditional (__network_wpa_supplicant_required): False 25052 1726882489.72576: when evaluation is False, skipping this task 25052 1726882489.72578: _execute() done 25052 1726882489.72580: dumping result to json 25052 1726882489.72583: done dumping result, returning 25052 1726882489.72650: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-f7f6-4a6d-00000000007b] 25052 1726882489.72807: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007b 25052 1726882489.73118: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007b 25052 1726882489.73122: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 25052 1726882489.73173: no more pending results, returning what we have 25052 1726882489.73177: results queue empty 25052 1726882489.73178: checking for any_errors_fatal 25052 1726882489.73201: done checking for any_errors_fatal 25052 1726882489.73203: checking for max_fail_percentage 25052 1726882489.73204: done checking for max_fail_percentage 25052 1726882489.73205: checking to see if all hosts have failed and the running result is not ok 25052 1726882489.73206: done checking to see if all hosts have failed 25052 1726882489.73207: getting the remaining hosts for this loop 25052 1726882489.73208: done getting the remaining hosts for this loop 25052 1726882489.73211: getting the next task for host managed_node2 25052 1726882489.73219: done getting next task for host managed_node2 25052 1726882489.73223: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 25052 1726882489.73227: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882489.73248: getting variables 25052 1726882489.73250: in VariableManager get_vars() 25052 1726882489.73286: Calling all_inventory to load vars for managed_node2 25052 1726882489.73289: Calling groups_inventory to load vars for managed_node2 25052 1726882489.73291: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882489.73351: Calling all_plugins_play to load vars for managed_node2 25052 1726882489.73355: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882489.73357: Calling groups_plugins_play to load vars for managed_node2 25052 1726882489.76317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882489.78999: done with get_vars() 25052 1726882489.79026: done getting variables 25052 1726882489.79095: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:34:49 -0400 (0:00:00.193) 0:00:26.746 ****** 25052 1726882489.79129: entering _queue_task() for managed_node2/service 25052 1726882489.79490: worker is 1 (out of 1 available) 25052 1726882489.79512: exiting _queue_task() for managed_node2/service 25052 1726882489.79524: done queuing things up, now waiting for results queue to drain 25052 1726882489.79525: waiting for pending results... 25052 1726882489.79755: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 25052 1726882489.79895: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007c 25052 1726882489.79921: variable 'ansible_search_path' from source: unknown 25052 1726882489.79930: variable 'ansible_search_path' from source: unknown 25052 1726882489.79968: calling self._execute() 25052 1726882489.80065: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882489.80076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882489.80090: variable 'omit' from source: magic vars 25052 1726882489.80502: variable 'ansible_distribution_major_version' from source: facts 25052 1726882489.80564: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882489.80796: variable 'network_provider' from source: set_fact 25052 1726882489.80800: Evaluated conditional (network_provider == "initscripts"): False 25052 1726882489.80803: when evaluation is False, skipping this task 25052 1726882489.80806: _execute() done 25052 1726882489.80808: dumping result to json 25052 1726882489.80810: done dumping result, returning 25052 1726882489.80813: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-f7f6-4a6d-00000000007c] 25052 1726882489.80815: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007c skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 25052 1726882489.81023: no more pending results, returning what we have 25052 1726882489.81027: results queue empty 25052 1726882489.81028: checking for 
any_errors_fatal 25052 1726882489.81036: done checking for any_errors_fatal 25052 1726882489.81037: checking for max_fail_percentage 25052 1726882489.81038: done checking for max_fail_percentage 25052 1726882489.81039: checking to see if all hosts have failed and the running result is not ok 25052 1726882489.81040: done checking to see if all hosts have failed 25052 1726882489.81040: getting the remaining hosts for this loop 25052 1726882489.81042: done getting the remaining hosts for this loop 25052 1726882489.81045: getting the next task for host managed_node2 25052 1726882489.81051: done getting next task for host managed_node2 25052 1726882489.81054: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25052 1726882489.81058: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882489.81076: getting variables 25052 1726882489.81077: in VariableManager get_vars() 25052 1726882489.81119: Calling all_inventory to load vars for managed_node2 25052 1726882489.81122: Calling groups_inventory to load vars for managed_node2 25052 1726882489.81124: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882489.81138: Calling all_plugins_play to load vars for managed_node2 25052 1726882489.81142: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882489.81144: Calling groups_plugins_play to load vars for managed_node2 25052 1726882489.82318: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007c 25052 1726882489.82322: WORKER PROCESS EXITING 25052 1726882489.82874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882489.85517: done with get_vars() 25052 1726882489.85537: done getting variables 25052 1726882489.85595: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:34:49 -0400 (0:00:00.066) 0:00:26.812 ****** 25052 1726882489.85742: entering _queue_task() for managed_node2/copy 25052 1726882489.86479: worker is 1 (out of 1 available) 25052 1726882489.86491: exiting _queue_task() for managed_node2/copy 25052 1726882489.86501: done queuing things up, now waiting for results queue to drain 25052 1726882489.86502: waiting for pending results... 
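The two skipped tasks above ("Enable and start wpa_supplicant" and "Enable network service") illustrate the conditional-skip pattern in this log: when a task's when clause evaluates to False, the task is reported as skipping and the failing expression is recorded as false_condition. A minimal sketch of tasks guarded this way, using the condition names taken from the log; the module bodies are illustrative assumptions, not the role's actual source:

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required  # evaluated to False in the log above

- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when: network_provider == "initscripts"  # evaluated to False in the log above
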
25052 1726882489.86898: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 25052 1726882489.87236: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007d 25052 1726882489.87251: variable 'ansible_search_path' from source: unknown 25052 1726882489.87255: variable 'ansible_search_path' from source: unknown 25052 1726882489.87289: calling self._execute() 25052 1726882489.87578: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882489.87582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882489.87597: variable 'omit' from source: magic vars 25052 1726882489.88432: variable 'ansible_distribution_major_version' from source: facts 25052 1726882489.88600: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882489.88800: variable 'network_provider' from source: set_fact 25052 1726882489.88804: Evaluated conditional (network_provider == "initscripts"): False 25052 1726882489.88806: when evaluation is False, skipping this task 25052 1726882489.88808: _execute() done 25052 1726882489.88810: dumping result to json 25052 1726882489.88812: done dumping result, returning 25052 1726882489.88815: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-f7f6-4a6d-00000000007d] 25052 1726882489.88817: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007d skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 25052 1726882489.89137: no more pending results, returning what we have 25052 1726882489.89140: results queue empty 25052 1726882489.89141: checking for any_errors_fatal 25052 1726882489.89148: done checking for any_errors_fatal 25052 1726882489.89148: checking for max_fail_percentage 25052 1726882489.89150: done checking for max_fail_percentage 25052 1726882489.89151: checking to see if all hosts have failed and the running result is not ok 25052 1726882489.89152: done checking to see if all hosts have failed 25052 1726882489.89152: getting the remaining hosts for this loop 25052 1726882489.89154: done getting the remaining hosts for this loop 25052 1726882489.89156: getting the next task for host managed_node2 25052 1726882489.89164: done getting next task for host managed_node2 25052 1726882489.89167: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25052 1726882489.89171: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882489.89189: getting variables 25052 1726882489.89191: in VariableManager get_vars() 25052 1726882489.89232: Calling all_inventory to load vars for managed_node2 25052 1726882489.89235: Calling groups_inventory to load vars for managed_node2 25052 1726882489.89237: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882489.89247: Calling all_plugins_play to load vars for managed_node2 25052 1726882489.89251: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882489.89254: Calling groups_plugins_play to load vars for managed_node2 25052 1726882489.89976: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007d 25052 1726882489.89980: WORKER PROCESS EXITING 25052 1726882489.92517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882489.96309: done with get_vars() 25052 1726882489.96341: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:34:49 -0400 (0:00:00.106) 0:00:26.919 ****** 25052 1726882489.96433: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25052 1726882489.97088: worker is 1 (out of 1 available) 25052 1726882489.97103: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 25052 1726882489.97115: done queuing things up, now waiting for results queue to drain 25052 1726882489.97116: waiting for pending results... 25052 1726882489.97617: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 25052 1726882489.97940: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007e 25052 1726882489.98136: variable 'ansible_search_path' from source: unknown 25052 1726882489.98139: variable 'ansible_search_path' from source: unknown 25052 1726882489.98142: calling self._execute() 25052 1726882489.98344: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882489.98358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882489.98374: variable 'omit' from source: magic vars 25052 1726882489.98939: variable 'ansible_distribution_major_version' from source: facts 25052 1726882489.99124: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882489.99137: variable 'omit' from source: magic vars 25052 1726882489.99199: variable 'omit' from source: magic vars 25052 1726882489.99607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882490.04002: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882490.04006: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882490.04050: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882490.04091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882490.04131: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882490.04344: variable 'network_provider' from source: set_fact 25052 1726882490.04553: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882490.04657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882490.04689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882490.04802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882490.04823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882490.04944: variable 'omit' from source: magic vars 25052 1726882490.05399: variable 'omit' from source: magic vars 25052 1726882490.05632: variable 'network_connections' from source: task vars 25052 1726882490.05635: variable 'interface' from source: play vars 25052 1726882490.05637: variable 'interface' from source: play vars 25052 1726882490.05868: variable 'omit' from source: magic vars 25052 1726882490.05969: variable '__lsr_ansible_managed' from source: task vars 25052 1726882490.06035: variable '__lsr_ansible_managed' from source: task vars 25052 1726882490.07300: Loaded config def from plugin (lookup/template) 25052 1726882490.07311: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 25052 1726882490.07344: File lookup term: get_ansible_managed.j2 25052 1726882490.07352: variable 'ansible_search_path' from source: unknown 25052 1726882490.07361: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 25052 1726882490.07383: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 25052 1726882490.07500: variable 'ansible_search_path' from source: unknown 25052 1726882490.19912: variable 'ansible_managed' from source: unknown 25052 1726882490.20418: variable 'omit' from source: magic vars 25052 1726882490.20514: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882490.20574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882490.20722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882490.21058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.21061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.21064: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882490.21066: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.21069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.21500: Set connection var ansible_pipelining to False 25052 1726882490.21503: Set connection var ansible_connection to ssh 25052 1726882490.21505: Set connection var ansible_shell_type to sh 25052 1726882490.21507: Set connection var ansible_timeout to 10 25052 1726882490.21509: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882490.21511: Set connection var ansible_shell_executable to /bin/sh 25052 1726882490.21513: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.21515: variable 'ansible_connection' from source: unknown 25052 1726882490.21517: variable 'ansible_module_compression' from source: unknown 25052 1726882490.21520: variable 'ansible_shell_type' from source: unknown 25052 1726882490.21522: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.21524: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.21527: variable 'ansible_pipelining' from source: unknown 25052 1726882490.21529: variable 'ansible_timeout' from source: unknown 25052 1726882490.21531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.22044: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882490.22056: variable 'omit' from source: magic vars 25052 1726882490.22059: starting attempt loop 25052 1726882490.22061: running the handler 25052 1726882490.22076: _low_level_execute_command(): starting 25052 1726882490.22082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882490.23558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882490.23562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882490.23679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.23683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882490.23701: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 25052 1726882490.23707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882490.23723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882490.23728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.23800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882490.23803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882490.24106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.24157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.25948: stdout chunk (state=3): >>>/root <<< 25052 1726882490.25951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882490.25998: stderr chunk (state=3): >>><<< 25052 1726882490.26001: stdout chunk (state=3): >>><<< 25052 1726882490.26025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882490.26406: _low_level_execute_command(): starting 25052 1726882490.26413: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660 `" && echo ansible-tmp-1726882490.2602925-26310-121805737782660="` echo /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660 `" ) && sleep 0' 25052 1726882490.27669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.27673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882490.27699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882490.27703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882490.27705: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.27719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882490.27725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882490.27730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.27909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882490.27973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.28062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.29958: stdout chunk (state=3): >>>ansible-tmp-1726882490.2602925-26310-121805737782660=/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660 <<< 25052 1726882490.30062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882490.30106: stderr chunk (state=3): >>><<< 25052 1726882490.30115: stdout chunk (state=3): >>><<< 25052 1726882490.30136: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882490.2602925-26310-121805737782660=/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882490.30333: variable 'ansible_module_compression' from source: unknown 25052 1726882490.30379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 25052 1726882490.30643: variable 'ansible_facts' from source: unknown 25052 1726882490.31163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py 25052 1726882490.31856: Sending initial data 25052 1726882490.31859: Sent initial data (168 bytes) 25052 1726882490.32875: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 25052 1726882490.33118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882490.33362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.33572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.35101: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882490.35154: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882490.35265: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmplez9f0aw /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py <<< 25052 1726882490.35268: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py" <<< 25052 1726882490.35324: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmplez9f0aw" to remote "/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py" <<< 25052 1726882490.37595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882490.37608: stdout chunk (state=3): >>><<< 25052 1726882490.37621: stderr chunk (state=3): >>><<< 25052 1726882490.38005: done transferring module to remote 25052 1726882490.38009: _low_level_execute_command(): starting 25052 1726882490.38012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/ /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py && sleep 0' 25052 1726882490.39260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882490.39263: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.39265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882490.39267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882490.39269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.39271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.39273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882490.39275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.39408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882490.39511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.39584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.41491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882490.41497: stdout chunk (state=3): >>><<< 25052 1726882490.41500: stderr chunk (state=3): >>><<< 25052 1726882490.41503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882490.41505: _low_level_execute_command(): starting 25052 1726882490.41507: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/AnsiballZ_network_connections.py && sleep 0' 25052 1726882490.42638: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882490.42671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.42686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882490.42705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882490.42811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.43042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.43138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.74356: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uuep457z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uuep457z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/55dc2a1c-03d2-45b8-a3e7-a9c369c581cc: error=unknown <<< 25052 1726882490.74651: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 25052 1726882490.76400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882490.76404: stdout chunk (state=3): >>><<< 25052 1726882490.76406: stderr chunk (state=3): >>><<< 25052 1726882490.76408: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uuep457z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_uuep457z/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on veth0/55dc2a1c-03d2-45b8-a3e7-a9c369c581cc: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "veth0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882490.76449: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'veth0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882490.76482: _low_level_execute_command(): starting 25052 1726882490.76506: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882490.2602925-26310-121805737782660/ > /dev/null 2>&1 && sleep 0' 25052 1726882490.77276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882490.77306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882490.77408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882490.77429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882490.77444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882490.77466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882490.77571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882490.79413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882490.79426: stderr chunk (state=3): >>><<< 25052 1726882490.79429: stdout chunk (state=3): >>><<< 25052 1726882490.79445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882490.79452: handler run complete 25052 1726882490.79480: attempt loop complete, returning result 25052 1726882490.79483: _execute() done 25052 1726882490.79486: dumping result to json 25052 1726882490.79490: done dumping result, returning 25052 1726882490.79515: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-f7f6-4a6d-00000000007e] 25052 1726882490.79518: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007e 25052 1726882490.79620: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007e 25052 1726882490.79623: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 25052 1726882490.79729: no more pending results, returning what we have 25052 1726882490.79734: results queue empty 25052 1726882490.79735: checking for any_errors_fatal 25052 1726882490.79741: done checking for any_errors_fatal 25052 1726882490.79742: checking for max_fail_percentage 25052 1726882490.79744: done checking for max_fail_percentage 25052 1726882490.79745: checking to see if all hosts have failed and the running result is not ok 25052 1726882490.79746: done checking to see if all hosts have failed 25052 1726882490.79747: getting the remaining hosts for this loop 25052 1726882490.79748: done getting the remaining hosts for this loop 25052 1726882490.79751: getting the next task for host managed_node2 25052 1726882490.79756: done getting next task for host managed_node2 25052 1726882490.79760: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 25052 1726882490.79762: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882490.79777: getting variables 25052 1726882490.79780: in VariableManager get_vars() 25052 1726882490.79829: Calling all_inventory to load vars for managed_node2 25052 1726882490.79832: Calling groups_inventory to load vars for managed_node2 25052 1726882490.79834: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882490.79843: Calling all_plugins_play to load vars for managed_node2 25052 1726882490.79845: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882490.79847: Calling groups_plugins_play to load vars for managed_node2 25052 1726882490.80869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882490.81930: done with get_vars() 25052 1726882490.81957: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:34:50 -0400 (0:00:00.856) 0:00:27.775 ****** 25052 1726882490.82039: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25052 1726882490.82309: worker is 1 (out of 1 available) 25052 1726882490.82325: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 25052 1726882490.82338: done queuing things up, now waiting for results queue to drain 25052 1726882490.82339: waiting for pending results... 25052 1726882490.82564: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 25052 1726882490.82664: in run() - task 12673a56-9f93-f7f6-4a6d-00000000007f 25052 1726882490.82687: variable 'ansible_search_path' from source: unknown 25052 1726882490.82691: variable 'ansible_search_path' from source: unknown 25052 1726882490.82735: calling self._execute() 25052 1726882490.82856: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.82864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.82867: variable 'omit' from source: magic vars 25052 1726882490.83500: variable 'ansible_distribution_major_version' from source: facts 25052 1726882490.83503: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882490.83650: variable 'network_state' from source: role '' defaults 25052 1726882490.83665: Evaluated conditional (network_state != {}): False 25052 1726882490.83728: when evaluation is False, skipping this task 25052 1726882490.83733: _execute() done 25052 1726882490.83736: dumping result to json 25052 1726882490.83740: done dumping result, returning 25052 1726882490.83743: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-f7f6-4a6d-00000000007f] 25052 1726882490.83749: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007f skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 25052 1726882490.83980: no more pending results, returning what we have 25052 1726882490.83984: results queue empty 25052 1726882490.83984: checking for any_errors_fatal 25052 1726882490.83999: done checking for any_errors_fatal 25052 1726882490.84000: checking for max_fail_percentage 25052 1726882490.84001: done checking for max_fail_percentage 25052 1726882490.84002: checking to see if all hosts have failed and the running result is 
not ok 25052 1726882490.84003: done checking to see if all hosts have failed 25052 1726882490.84004: getting the remaining hosts for this loop 25052 1726882490.84005: done getting the remaining hosts for this loop 25052 1726882490.84008: getting the next task for host managed_node2 25052 1726882490.84100: done getting next task for host managed_node2 25052 1726882490.84104: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25052 1726882490.84108: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882490.84142: getting variables 25052 1726882490.84143: in VariableManager get_vars() 25052 1726882490.84177: Calling all_inventory to load vars for managed_node2 25052 1726882490.84179: Calling groups_inventory to load vars for managed_node2 25052 1726882490.84181: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882490.84189: Calling all_plugins_play to load vars for managed_node2 25052 1726882490.84192: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882490.84197: Calling groups_plugins_play to load vars for managed_node2 25052 1726882490.84724: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000007f 25052 1726882490.84729: WORKER PROCESS EXITING 25052 1726882490.85318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882490.86306: done with get_vars() 25052 1726882490.86321: done getting variables 25052 1726882490.86363: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:34:50 -0400 (0:00:00.043) 0:00:27.818 ****** 25052 1726882490.86385: entering _queue_task() for managed_node2/debug 25052 1726882490.86600: worker is 1 (out of 1 available) 25052 1726882490.86613: exiting _queue_task() for managed_node2/debug 25052 1726882490.86624: done queuing things up, now waiting for results queue to drain 25052 1726882490.86626: waiting for pending results... 
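The "Configure networking state" task above is skipped because the role-default network_state is an empty dict, so the conditional network_state != {} evaluates to False. A minimal sketch of host or play vars that would make that branch run might look like the following; the nmstate-style schema and the eth1 interface name are assumptions for illustration and are not taken from this run:

    network_state:
      interfaces:
        - name: eth1        # placeholder interface, not part of this test run
          type: ethernet
          state: up
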
25052 1726882490.86808: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 25052 1726882490.86896: in run() - task 12673a56-9f93-f7f6-4a6d-000000000080 25052 1726882490.86907: variable 'ansible_search_path' from source: unknown 25052 1726882490.86910: variable 'ansible_search_path' from source: unknown 25052 1726882490.86936: calling self._execute() 25052 1726882490.87011: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.87014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.87022: variable 'omit' from source: magic vars 25052 1726882490.87301: variable 'ansible_distribution_major_version' from source: facts 25052 1726882490.87307: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882490.87309: variable 'omit' from source: magic vars 25052 1726882490.87346: variable 'omit' from source: magic vars 25052 1726882490.87385: variable 'omit' from source: magic vars 25052 1726882490.87418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882490.87454: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882490.87469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882490.87482: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.87497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.87517: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882490.87520: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.87523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.87599: Set connection var ansible_pipelining to False 25052 1726882490.87603: Set connection var ansible_connection to ssh 25052 1726882490.87605: Set connection var ansible_shell_type to sh 25052 1726882490.87619: Set connection var ansible_timeout to 10 25052 1726882490.87623: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882490.87626: Set connection var ansible_shell_executable to /bin/sh 25052 1726882490.87655: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.87658: variable 'ansible_connection' from source: unknown 25052 1726882490.87661: variable 'ansible_module_compression' from source: unknown 25052 1726882490.87663: variable 'ansible_shell_type' from source: unknown 25052 1726882490.87665: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.87668: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.87669: variable 'ansible_pipelining' from source: unknown 25052 1726882490.87671: variable 'ansible_timeout' from source: unknown 25052 1726882490.87673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.87801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 
1726882490.87815: variable 'omit' from source: magic vars 25052 1726882490.87818: starting attempt loop 25052 1726882490.87821: running the handler 25052 1726882490.87946: variable '__network_connections_result' from source: set_fact 25052 1726882490.87985: handler run complete 25052 1726882490.88001: attempt loop complete, returning result 25052 1726882490.88004: _execute() done 25052 1726882490.88007: dumping result to json 25052 1726882490.88009: done dumping result, returning 25052 1726882490.88017: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-f7f6-4a6d-000000000080] 25052 1726882490.88021: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000080 25052 1726882490.88109: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000080 25052 1726882490.88112: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 25052 1726882490.88185: no more pending results, returning what we have 25052 1726882490.88189: results queue empty 25052 1726882490.88189: checking for any_errors_fatal 25052 1726882490.88200: done checking for any_errors_fatal 25052 1726882490.88201: checking for max_fail_percentage 25052 1726882490.88202: done checking for max_fail_percentage 25052 1726882490.88203: checking to see if all hosts have failed and the running result is not ok 25052 1726882490.88204: done checking to see if all hosts have failed 25052 1726882490.88205: getting the remaining hosts for this loop 25052 1726882490.88206: done getting the remaining hosts for this loop 25052 1726882490.88209: getting the next task for host managed_node2 25052 1726882490.88214: done getting next task for host managed_node2 25052 1726882490.88218: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25052 1726882490.88222: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882490.88234: getting variables 25052 1726882490.88236: in VariableManager get_vars() 25052 1726882490.88266: Calling all_inventory to load vars for managed_node2 25052 1726882490.88269: Calling groups_inventory to load vars for managed_node2 25052 1726882490.88271: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882490.88278: Calling all_plugins_play to load vars for managed_node2 25052 1726882490.88281: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882490.88285: Calling groups_plugins_play to load vars for managed_node2 25052 1726882490.89367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882490.90347: done with get_vars() 25052 1726882490.90362: done getting variables 25052 1726882490.90407: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:34:50 -0400 (0:00:00.040) 0:00:27.859 ****** 25052 1726882490.90434: entering _queue_task() for managed_node2/debug 25052 1726882490.90735: worker is 1 (out of 1 available) 25052 1726882490.90750: exiting _queue_task() for managed_node2/debug 25052 1726882490.90761: done queuing things up, now waiting for results queue to drain 25052 1726882490.90762: waiting for pending results... 25052 1726882490.91237: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 25052 1726882490.91246: in run() - task 12673a56-9f93-f7f6-4a6d-000000000081 25052 1726882490.91250: variable 'ansible_search_path' from source: unknown 25052 1726882490.91252: variable 'ansible_search_path' from source: unknown 25052 1726882490.91254: calling self._execute() 25052 1726882490.91280: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.91286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.91298: variable 'omit' from source: magic vars 25052 1726882490.91771: variable 'ansible_distribution_major_version' from source: facts 25052 1726882490.91778: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882490.91785: variable 'omit' from source: magic vars 25052 1726882490.91844: variable 'omit' from source: magic vars 25052 1726882490.91985: variable 'omit' from source: magic vars 25052 1726882490.91989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882490.91997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882490.92000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882490.92027: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.92032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882490.92064: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882490.92067: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.92069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.92205: Set connection var ansible_pipelining to False 25052 1726882490.92209: Set connection var ansible_connection to ssh 25052 1726882490.92211: Set connection var ansible_shell_type to sh 25052 1726882490.92213: Set connection var ansible_timeout to 10 25052 1726882490.92221: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882490.92226: Set connection var ansible_shell_executable to /bin/sh 25052 1726882490.92248: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.92251: variable 'ansible_connection' from source: unknown 25052 1726882490.92254: variable 'ansible_module_compression' from source: unknown 25052 1726882490.92256: variable 'ansible_shell_type' from source: unknown 25052 1726882490.92258: variable 'ansible_shell_executable' from source: unknown 25052 1726882490.92261: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.92307: variable 'ansible_pipelining' from source: unknown 25052 1726882490.92310: variable 'ansible_timeout' from source: unknown 25052 1726882490.92313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.92398: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882490.92414: variable 'omit' from source: magic vars 25052 1726882490.92417: starting attempt loop 25052 1726882490.92421: running the handler 25052 1726882490.92466: variable '__network_connections_result' from source: set_fact 25052 1726882490.92572: variable '__network_connections_result' from source: set_fact 25052 1726882490.92639: handler run complete 25052 1726882490.92664: attempt loop complete, returning result 25052 1726882490.92667: _execute() done 25052 1726882490.92670: dumping result to json 25052 1726882490.92741: done dumping result, returning 25052 1726882490.92744: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-f7f6-4a6d-000000000081] 25052 1726882490.92746: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000081 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "veth0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 25052 1726882490.92962: no more pending results, returning what we have 25052 1726882490.92965: results queue empty 25052 1726882490.92966: checking for any_errors_fatal 25052 1726882490.92971: done checking for any_errors_fatal 25052 1726882490.92971: checking for max_fail_percentage 25052 1726882490.92973: done checking for max_fail_percentage 25052 1726882490.92974: checking to see if all hosts have failed and the running result is not ok 25052 1726882490.92974: done checking to see if all hosts have failed 
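The __network_connections_result dumped above records the arguments the role passed to the fedora.linux_system_roles.network_connections module. As a rough sketch, a play driving the role with equivalent input would set the role's network_connections variable to the same list; the play structure below is assumed for illustration, with the connection values copied from the result shown in this log:

    - hosts: managed_node2
      roles:
        - fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: veth0
            persistent_state: absent   # remove the persistent profile
            state: down                # and take the device down
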
25052 1726882490.92975: getting the remaining hosts for this loop 25052 1726882490.92976: done getting the remaining hosts for this loop 25052 1726882490.92979: getting the next task for host managed_node2 25052 1726882490.92984: done getting next task for host managed_node2 25052 1726882490.92987: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25052 1726882490.92990: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882490.93004: getting variables 25052 1726882490.93006: in VariableManager get_vars() 25052 1726882490.93041: Calling all_inventory to load vars for managed_node2 25052 1726882490.93044: Calling groups_inventory to load vars for managed_node2 25052 1726882490.93047: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882490.93055: Calling all_plugins_play to load vars for managed_node2 25052 1726882490.93058: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882490.93061: Calling groups_plugins_play to load vars for managed_node2 25052 1726882490.93604: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000081 25052 1726882490.93607: WORKER PROCESS EXITING 25052 1726882490.93940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882490.94942: done with get_vars() 25052 1726882490.94961: done getting variables 25052 1726882490.95007: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:34:50 -0400 (0:00:00.046) 0:00:27.905 ****** 25052 1726882490.95035: entering _queue_task() for managed_node2/debug 25052 1726882490.95298: worker is 1 (out of 1 available) 25052 1726882490.95312: exiting _queue_task() for managed_node2/debug 25052 1726882490.95323: done queuing things up, now waiting for results queue to drain 25052 1726882490.95324: waiting for pending results... 
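The "Show ... messages" tasks reported in this part of the log (tasks/main.yml:177, 181 and 186) are plain debug tasks over the registered results; their exact definitions are not visible here, so the following is only a sketch consistent with the output they produce, and the __network_state_result name in the last task is an assumed placeholder:

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result

    - name: Show debug messages for the network_state
      debug:
        var: __network_state_result      # assumed result variable name
      when: network_state != {}          # condition matches the skip reason logged for this task
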
25052 1726882490.95527: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 25052 1726882490.95721: in run() - task 12673a56-9f93-f7f6-4a6d-000000000082 25052 1726882490.95730: variable 'ansible_search_path' from source: unknown 25052 1726882490.95733: variable 'ansible_search_path' from source: unknown 25052 1726882490.95781: calling self._execute() 25052 1726882490.95957: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882490.95998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882490.96002: variable 'omit' from source: magic vars 25052 1726882490.96642: variable 'ansible_distribution_major_version' from source: facts 25052 1726882490.96826: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882490.97004: variable 'network_state' from source: role '' defaults 25052 1726882490.97082: Evaluated conditional (network_state != {}): False 25052 1726882490.97086: when evaluation is False, skipping this task 25052 1726882490.97089: _execute() done 25052 1726882490.97095: dumping result to json 25052 1726882490.97098: done dumping result, returning 25052 1726882490.97106: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-f7f6-4a6d-000000000082] 25052 1726882490.97110: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000082 25052 1726882490.97387: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000082 25052 1726882490.97390: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 25052 1726882490.97447: no more pending results, returning what we have 25052 1726882490.97452: results queue empty 25052 1726882490.97453: checking for any_errors_fatal 25052 1726882490.97470: done checking for any_errors_fatal 25052 1726882490.97472: checking for max_fail_percentage 25052 1726882490.97474: done checking for max_fail_percentage 25052 1726882490.97475: checking to see if all hosts have failed and the running result is not ok 25052 1726882490.97476: done checking to see if all hosts have failed 25052 1726882490.97476: getting the remaining hosts for this loop 25052 1726882490.97478: done getting the remaining hosts for this loop 25052 1726882490.97481: getting the next task for host managed_node2 25052 1726882490.97488: done getting next task for host managed_node2 25052 1726882490.97496: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 25052 1726882490.97500: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882490.97522: getting variables 25052 1726882490.97524: in VariableManager get_vars() 25052 1726882490.97573: Calling all_inventory to load vars for managed_node2 25052 1726882490.97575: Calling groups_inventory to load vars for managed_node2 25052 1726882490.97577: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882490.97589: Calling all_plugins_play to load vars for managed_node2 25052 1726882490.97937: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882490.97942: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.05387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.06917: done with get_vars() 25052 1726882491.06943: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:34:51 -0400 (0:00:00.119) 0:00:28.025 ****** 25052 1726882491.07031: entering _queue_task() for managed_node2/ping 25052 1726882491.07401: worker is 1 (out of 1 available) 25052 1726882491.07414: exiting _queue_task() for managed_node2/ping 25052 1726882491.07426: done queuing things up, now waiting for results queue to drain 25052 1726882491.07427: waiting for pending results... 25052 1726882491.07815: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 25052 1726882491.07853: in run() - task 12673a56-9f93-f7f6-4a6d-000000000083 25052 1726882491.07875: variable 'ansible_search_path' from source: unknown 25052 1726882491.07884: variable 'ansible_search_path' from source: unknown 25052 1726882491.07935: calling self._execute() 25052 1726882491.08043: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.08055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.08072: variable 'omit' from source: magic vars 25052 1726882491.08461: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.08477: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.08490: variable 'omit' from source: magic vars 25052 1726882491.08555: variable 'omit' from source: magic vars 25052 1726882491.08606: variable 'omit' from source: magic vars 25052 1726882491.08649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882491.08696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882491.08783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882491.08787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882491.08789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882491.08801: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882491.08810: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.08819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.08933: Set connection var ansible_pipelining to False 25052 1726882491.08942: Set connection var 
ansible_connection to ssh 25052 1726882491.08949: Set connection var ansible_shell_type to sh 25052 1726882491.08962: Set connection var ansible_timeout to 10 25052 1726882491.08974: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882491.08985: Set connection var ansible_shell_executable to /bin/sh 25052 1726882491.09018: variable 'ansible_shell_executable' from source: unknown 25052 1726882491.09108: variable 'ansible_connection' from source: unknown 25052 1726882491.09112: variable 'ansible_module_compression' from source: unknown 25052 1726882491.09114: variable 'ansible_shell_type' from source: unknown 25052 1726882491.09117: variable 'ansible_shell_executable' from source: unknown 25052 1726882491.09119: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.09121: variable 'ansible_pipelining' from source: unknown 25052 1726882491.09123: variable 'ansible_timeout' from source: unknown 25052 1726882491.09125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.09273: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 25052 1726882491.09295: variable 'omit' from source: magic vars 25052 1726882491.09306: starting attempt loop 25052 1726882491.09313: running the handler 25052 1726882491.09336: _low_level_execute_command(): starting 25052 1726882491.09349: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882491.10040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882491.10109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.10172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882491.10207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882491.10213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.10274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.11949: stdout chunk (state=3): >>>/root <<< 25052 1726882491.12105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882491.12110: stdout chunk (state=3): >>><<< 25052 1726882491.12113: stderr chunk (state=3): >>><<< 25052 1726882491.12148: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882491.12241: _low_level_execute_command(): starting 25052 1726882491.12246: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248 `" && echo ansible-tmp-1726882491.1215525-26356-119339019087248="` echo /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248 `" ) && sleep 0' 25052 1726882491.12892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.12899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.12902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.12912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.12954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882491.12976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.13029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.14927: stdout chunk (state=3): >>>ansible-tmp-1726882491.1215525-26356-119339019087248=/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248 <<< 25052 1726882491.15091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882491.15098: stdout chunk (state=3): >>><<< 25052 1726882491.15103: stderr chunk (state=3): >>><<< 25052 1726882491.15302: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882491.1215525-26356-119339019087248=/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882491.15306: variable 'ansible_module_compression' from source: unknown 25052 1726882491.15309: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 25052 1726882491.15311: variable 'ansible_facts' from source: unknown 25052 1726882491.15359: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py 25052 1726882491.15547: Sending initial data 25052 1726882491.15551: Sent initial data (153 bytes) 25052 1726882491.16121: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882491.16134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882491.16146: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.16191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882491.16208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.16273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.17830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882491.17885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882491.17965: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp1lnqxvao /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py <<< 25052 1726882491.17972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py" <<< 25052 1726882491.18019: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp1lnqxvao" to remote "/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py" <<< 25052 1726882491.18900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882491.18903: stderr chunk (state=3): >>><<< 25052 1726882491.18906: stdout chunk (state=3): >>><<< 25052 1726882491.18908: done transferring module to remote 25052 1726882491.18911: _low_level_execute_command(): starting 25052 1726882491.18921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/ /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py && sleep 0' 25052 1726882491.19438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.19451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882491.19464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.19510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882491.19526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.19595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.21378: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882491.21381: stdout chunk (state=3): >>><<< 25052 1726882491.21384: stderr chunk (state=3): >>><<< 25052 1726882491.21403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882491.21407: _low_level_execute_command(): starting 25052 1726882491.21409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/AnsiballZ_ping.py && sleep 0' 25052 1726882491.21870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.21907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882491.21910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882491.21914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.21916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882491.21918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882491.21920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.21956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882491.21963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.22045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.36665: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 25052 1726882491.37901: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882491.37905: stdout chunk (state=3): >>><<< 25052 1726882491.37907: stderr chunk (state=3): >>><<< 25052 1726882491.37910: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882491.37913: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882491.37925: _low_level_execute_command(): starting 25052 1726882491.37928: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882491.1215525-26356-119339019087248/ > /dev/null 2>&1 && sleep 0' 25052 1726882491.38425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.38430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882491.38450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.38454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882491.38470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 25052 1726882491.38473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.38528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882491.38531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882491.38599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882491.40440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882491.40446: stderr chunk (state=3): >>><<< 25052 1726882491.40449: stdout chunk (state=3): >>><<< 25052 1726882491.40468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882491.40475: handler run complete 25052 1726882491.40496: attempt loop complete, returning result 25052 1726882491.40500: _execute() done 25052 1726882491.40502: dumping result to json 25052 1726882491.40504: done dumping result, returning 25052 1726882491.40511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-f7f6-4a6d-000000000083] 25052 1726882491.40523: sending task result for task 12673a56-9f93-f7f6-4a6d-000000000083 25052 1726882491.40670: done sending task result for task 12673a56-9f93-f7f6-4a6d-000000000083 25052 1726882491.40673: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 25052 1726882491.40838: no more pending results, returning what we have 25052 1726882491.40841: results queue empty 25052 1726882491.40842: checking for any_errors_fatal 25052 1726882491.40847: done checking for any_errors_fatal 25052 1726882491.40848: checking for max_fail_percentage 25052 1726882491.40849: done checking for max_fail_percentage 25052 1726882491.40850: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.40851: done checking to see if all hosts have failed 25052 1726882491.40852: getting the remaining hosts for this loop 25052 1726882491.40853: done getting the remaining hosts for this loop 25052 1726882491.40856: getting the next task for host managed_node2 25052 1726882491.40864: done getting next task for host managed_node2 25052 
1726882491.40866: ^ task is: TASK: meta (role_complete) 25052 1726882491.40868: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882491.40882: getting variables 25052 1726882491.40883: in VariableManager get_vars() 25052 1726882491.40932: Calling all_inventory to load vars for managed_node2 25052 1726882491.40935: Calling groups_inventory to load vars for managed_node2 25052 1726882491.40938: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.40946: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.40949: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.40952: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.42014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.42999: done with get_vars() 25052 1726882491.43014: done getting variables 25052 1726882491.43068: done queuing things up, now waiting for results queue to drain 25052 1726882491.43070: results queue empty 25052 1726882491.43071: checking for any_errors_fatal 25052 1726882491.43073: done checking for any_errors_fatal 25052 1726882491.43074: checking for max_fail_percentage 25052 1726882491.43074: done checking for max_fail_percentage 25052 1726882491.43075: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.43075: done checking to see if all hosts have failed 25052 1726882491.43076: getting the remaining hosts for this loop 25052 1726882491.43076: done getting the remaining hosts for this loop 25052 1726882491.43078: getting the next task for host managed_node2 25052 1726882491.43082: done getting next task for host managed_node2 25052 1726882491.43083: ^ task is: TASK: Include the task 'manage_test_interface.yml' 25052 1726882491.43084: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882491.43086: getting variables 25052 1726882491.43087: in VariableManager get_vars() 25052 1726882491.43100: Calling all_inventory to load vars for managed_node2 25052 1726882491.43101: Calling groups_inventory to load vars for managed_node2 25052 1726882491.43103: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.43106: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.43107: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.43109: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.44032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.45796: done with get_vars() 25052 1726882491.45816: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:104 Friday 20 September 2024 21:34:51 -0400 (0:00:00.388) 0:00:28.413 ****** 25052 1726882491.45897: entering _queue_task() for managed_node2/include_tasks 25052 1726882491.46250: worker is 1 (out of 1 available) 25052 1726882491.46263: exiting _queue_task() for managed_node2/include_tasks 25052 1726882491.46277: done queuing things up, now waiting for results queue to drain 25052 1726882491.46278: waiting for pending results... 25052 1726882491.46616: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 25052 1726882491.46637: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000b3 25052 1726882491.46708: variable 'ansible_search_path' from source: unknown 25052 1726882491.46715: calling self._execute() 25052 1726882491.46799: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.46803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.46819: variable 'omit' from source: magic vars 25052 1726882491.47230: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.47253: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.47266: _execute() done 25052 1726882491.47269: dumping result to json 25052 1726882491.47280: done dumping result, returning 25052 1726882491.47303: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-f7f6-4a6d-0000000000b3] 25052 1726882491.47307: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b3 25052 1726882491.47432: no more pending results, returning what we have 25052 1726882491.47437: in VariableManager get_vars() 25052 1726882491.47483: Calling all_inventory to load vars for managed_node2 25052 1726882491.47486: Calling groups_inventory to load vars for managed_node2 25052 1726882491.47488: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.47502: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.47505: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.47507: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.48106: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b3 25052 1726882491.48110: WORKER PROCESS EXITING 25052 1726882491.48462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.50122: done with get_vars() 25052 
1726882491.50140: variable 'ansible_search_path' from source: unknown 25052 1726882491.50154: we have included files to process 25052 1726882491.50155: generating all_blocks data 25052 1726882491.50161: done generating all_blocks data 25052 1726882491.50166: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882491.50167: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882491.50172: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 25052 1726882491.50462: in VariableManager get_vars() 25052 1726882491.50497: done with get_vars() 25052 1726882491.51080: done processing included file 25052 1726882491.51081: iterating over new_blocks loaded from include file 25052 1726882491.51082: in VariableManager get_vars() 25052 1726882491.51097: done with get_vars() 25052 1726882491.51098: filtering new block on tags 25052 1726882491.51126: done filtering new block on tags 25052 1726882491.51130: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 25052 1726882491.51136: extending task lists for all hosts with included blocks 25052 1726882491.53296: done extending task lists 25052 1726882491.53299: done processing included files 25052 1726882491.53299: results queue empty 25052 1726882491.53300: checking for any_errors_fatal 25052 1726882491.53302: done checking for any_errors_fatal 25052 1726882491.53302: checking for max_fail_percentage 25052 1726882491.53303: done checking for max_fail_percentage 25052 1726882491.53304: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.53305: done checking to see if all hosts have failed 25052 1726882491.53306: getting the remaining hosts for this loop 25052 1726882491.53307: done getting the remaining hosts for this loop 25052 1726882491.53309: getting the next task for host managed_node2 25052 1726882491.53313: done getting next task for host managed_node2 25052 1726882491.53315: ^ task is: TASK: Ensure state in ["present", "absent"] 25052 1726882491.53318: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882491.53320: getting variables 25052 1726882491.53321: in VariableManager get_vars() 25052 1726882491.53335: Calling all_inventory to load vars for managed_node2 25052 1726882491.53337: Calling groups_inventory to load vars for managed_node2 25052 1726882491.53339: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.53344: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.53346: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.53349: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.55034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.56453: done with get_vars() 25052 1726882491.56476: done getting variables 25052 1726882491.56524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:34:51 -0400 (0:00:00.106) 0:00:28.520 ****** 25052 1726882491.56555: entering _queue_task() for managed_node2/fail 25052 1726882491.56935: worker is 1 (out of 1 available) 25052 1726882491.56949: exiting _queue_task() for managed_node2/fail 25052 1726882491.56960: done queuing things up, now waiting for results queue to drain 25052 1726882491.56961: waiting for pending results... 25052 1726882491.57329: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 25052 1726882491.57334: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005cc 25052 1726882491.57337: variable 'ansible_search_path' from source: unknown 25052 1726882491.57339: variable 'ansible_search_path' from source: unknown 25052 1726882491.57599: calling self._execute() 25052 1726882491.57603: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.57606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.57608: variable 'omit' from source: magic vars 25052 1726882491.57901: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.57920: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.58165: variable 'state' from source: include params 25052 1726882491.58177: Evaluated conditional (state not in ["present", "absent"]): False 25052 1726882491.58180: when evaluation is False, skipping this task 25052 1726882491.58182: _execute() done 25052 1726882491.58185: dumping result to json 25052 1726882491.58188: done dumping result, returning 25052 1726882491.58197: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-f7f6-4a6d-0000000005cc] 25052 1726882491.58200: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cc skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 25052 1726882491.58365: no more pending results, returning what we have 25052 1726882491.58370: results queue empty 25052 1726882491.58371: checking for any_errors_fatal 25052 
1726882491.58372: done checking for any_errors_fatal 25052 1726882491.58373: checking for max_fail_percentage 25052 1726882491.58375: done checking for max_fail_percentage 25052 1726882491.58375: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.58376: done checking to see if all hosts have failed 25052 1726882491.58377: getting the remaining hosts for this loop 25052 1726882491.58378: done getting the remaining hosts for this loop 25052 1726882491.58381: getting the next task for host managed_node2 25052 1726882491.58390: done getting next task for host managed_node2 25052 1726882491.58396: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 25052 1726882491.58400: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882491.58404: getting variables 25052 1726882491.58406: in VariableManager get_vars() 25052 1726882491.58448: Calling all_inventory to load vars for managed_node2 25052 1726882491.58451: Calling groups_inventory to load vars for managed_node2 25052 1726882491.58453: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.58465: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.58469: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.58471: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.59005: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cc 25052 1726882491.59008: WORKER PROCESS EXITING 25052 1726882491.59843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.62320: done with get_vars() 25052 1726882491.62350: done getting variables 25052 1726882491.62417: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:34:51 -0400 (0:00:00.058) 0:00:28.579 ****** 25052 1726882491.62450: entering _queue_task() for managed_node2/fail 25052 1726882491.62818: worker is 1 (out of 1 available) 25052 1726882491.62830: exiting _queue_task() for managed_node2/fail 25052 1726882491.62842: done queuing things up, now waiting for results queue to drain 25052 1726882491.62843: waiting for pending results... 
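The "Ensure state in [\"present\", \"absent\"]" task just skipped above and the "Ensure type in [\"dummy\", \"tap\", \"veth\"]" task that runs next are validation guards from manage_test_interface.yml (task paths :3 and :8). Only the fail action and the when conditions are visible in the log; a minimal sketch of how such guards are typically written, with the fail messages being assumptions:

  - name: Ensure state in ["present", "absent"]
    fail:
      msg: "Invalid state: {{ state }}"   # message text is an assumption; the log only shows the condition
    when: state not in ["present", "absent"]

  - name: Ensure type in ["dummy", "tap", "veth"]
    fail:
      msg: "Invalid type: {{ type }}"     # message text is an assumption
    when: type not in ["dummy", "tap", "veth"]
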
25052 1726882491.63109: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 25052 1726882491.63214: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005cd 25052 1726882491.63236: variable 'ansible_search_path' from source: unknown 25052 1726882491.63243: variable 'ansible_search_path' from source: unknown 25052 1726882491.63285: calling self._execute() 25052 1726882491.63499: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.63503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.63507: variable 'omit' from source: magic vars 25052 1726882491.63787: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.63810: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.63954: variable 'type' from source: play vars 25052 1726882491.63967: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 25052 1726882491.63975: when evaluation is False, skipping this task 25052 1726882491.63982: _execute() done 25052 1726882491.63989: dumping result to json 25052 1726882491.64000: done dumping result, returning 25052 1726882491.64010: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-f7f6-4a6d-0000000005cd] 25052 1726882491.64020: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cd skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 25052 1726882491.64185: no more pending results, returning what we have 25052 1726882491.64188: results queue empty 25052 1726882491.64190: checking for any_errors_fatal 25052 1726882491.64199: done checking for any_errors_fatal 25052 1726882491.64201: checking for max_fail_percentage 25052 1726882491.64202: done checking for max_fail_percentage 25052 1726882491.64203: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.64204: done checking to see if all hosts have failed 25052 1726882491.64205: getting the remaining hosts for this loop 25052 1726882491.64206: done getting the remaining hosts for this loop 25052 1726882491.64209: getting the next task for host managed_node2 25052 1726882491.64216: done getting next task for host managed_node2 25052 1726882491.64219: ^ task is: TASK: Include the task 'show_interfaces.yml' 25052 1726882491.64222: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882491.64226: getting variables 25052 1726882491.64229: in VariableManager get_vars() 25052 1726882491.64270: Calling all_inventory to load vars for managed_node2 25052 1726882491.64273: Calling groups_inventory to load vars for managed_node2 25052 1726882491.64275: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.64289: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.64292: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.64464: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cd 25052 1726882491.64467: WORKER PROCESS EXITING 25052 1726882491.64471: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.65740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.67538: done with get_vars() 25052 1726882491.67558: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:34:51 -0400 (0:00:00.052) 0:00:28.631 ****** 25052 1726882491.67661: entering _queue_task() for managed_node2/include_tasks 25052 1726882491.68277: worker is 1 (out of 1 available) 25052 1726882491.68289: exiting _queue_task() for managed_node2/include_tasks 25052 1726882491.68300: done queuing things up, now waiting for results queue to drain 25052 1726882491.68301: waiting for pending results... 25052 1726882491.68811: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 25052 1726882491.68998: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005ce 25052 1726882491.69002: variable 'ansible_search_path' from source: unknown 25052 1726882491.69005: variable 'ansible_search_path' from source: unknown 25052 1726882491.69017: calling self._execute() 25052 1726882491.69120: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.69399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.69402: variable 'omit' from source: magic vars 25052 1726882491.69889: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.70063: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.70075: _execute() done 25052 1726882491.70085: dumping result to json 25052 1726882491.70098: done dumping result, returning 25052 1726882491.70110: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-f7f6-4a6d-0000000005ce] 25052 1726882491.70119: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005ce 25052 1726882491.70258: no more pending results, returning what we have 25052 1726882491.70263: in VariableManager get_vars() 25052 1726882491.70317: Calling all_inventory to load vars for managed_node2 25052 1726882491.70320: Calling groups_inventory to load vars for managed_node2 25052 1726882491.70323: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.70337: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.70340: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.70344: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.70863: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005ce 25052 
1726882491.70866: WORKER PROCESS EXITING 25052 1726882491.71970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.73605: done with get_vars() 25052 1726882491.73625: variable 'ansible_search_path' from source: unknown 25052 1726882491.73627: variable 'ansible_search_path' from source: unknown 25052 1726882491.73669: we have included files to process 25052 1726882491.73670: generating all_blocks data 25052 1726882491.73673: done generating all_blocks data 25052 1726882491.73678: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882491.73679: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882491.73681: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 25052 1726882491.73798: in VariableManager get_vars() 25052 1726882491.73822: done with get_vars() 25052 1726882491.73929: done processing included file 25052 1726882491.73931: iterating over new_blocks loaded from include file 25052 1726882491.73932: in VariableManager get_vars() 25052 1726882491.73950: done with get_vars() 25052 1726882491.73951: filtering new block on tags 25052 1726882491.73968: done filtering new block on tags 25052 1726882491.73970: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 25052 1726882491.73975: extending task lists for all hosts with included blocks 25052 1726882491.74358: done extending task lists 25052 1726882491.74360: done processing included files 25052 1726882491.74360: results queue empty 25052 1726882491.74361: checking for any_errors_fatal 25052 1726882491.74364: done checking for any_errors_fatal 25052 1726882491.74365: checking for max_fail_percentage 25052 1726882491.74365: done checking for max_fail_percentage 25052 1726882491.74366: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.74367: done checking to see if all hosts have failed 25052 1726882491.74368: getting the remaining hosts for this loop 25052 1726882491.74369: done getting the remaining hosts for this loop 25052 1726882491.74371: getting the next task for host managed_node2 25052 1726882491.74375: done getting next task for host managed_node2 25052 1726882491.74377: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 25052 1726882491.74380: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 25052 1726882491.74382: getting variables 25052 1726882491.74383: in VariableManager get_vars() 25052 1726882491.74401: Calling all_inventory to load vars for managed_node2 25052 1726882491.74403: Calling groups_inventory to load vars for managed_node2 25052 1726882491.74405: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.74410: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.74412: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.74414: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.76110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.78822: done with get_vars() 25052 1726882491.78852: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:34:51 -0400 (0:00:00.115) 0:00:28.747 ****** 25052 1726882491.79226: entering _queue_task() for managed_node2/include_tasks 25052 1726882491.79932: worker is 1 (out of 1 available) 25052 1726882491.79945: exiting _queue_task() for managed_node2/include_tasks 25052 1726882491.80071: done queuing things up, now waiting for results queue to drain 25052 1726882491.80073: waiting for pending results... 25052 1726882491.80644: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 25052 1726882491.80985: in run() - task 12673a56-9f93-f7f6-4a6d-0000000006e4 25052 1726882491.81000: variable 'ansible_search_path' from source: unknown 25052 1726882491.81004: variable 'ansible_search_path' from source: unknown 25052 1726882491.81038: calling self._execute() 25052 1726882491.81271: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.81506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.81517: variable 'omit' from source: magic vars 25052 1726882491.82780: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.82797: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.82912: _execute() done 25052 1726882491.82916: dumping result to json 25052 1726882491.82925: done dumping result, returning 25052 1726882491.82934: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-f7f6-4a6d-0000000006e4] 25052 1726882491.82937: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000006e4 25052 1726882491.83052: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000006e4 25052 1726882491.83055: WORKER PROCESS EXITING 25052 1726882491.83095: no more pending results, returning what we have 25052 1726882491.83102: in VariableManager get_vars() 25052 1726882491.83158: Calling all_inventory to load vars for managed_node2 25052 1726882491.83161: Calling groups_inventory to load vars for managed_node2 25052 1726882491.83164: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.83180: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.83184: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.83186: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.86499: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.89312: done with get_vars() 25052 1726882491.89341: variable 'ansible_search_path' from source: unknown 25052 1726882491.89342: variable 'ansible_search_path' from source: unknown 25052 1726882491.89409: we have included files to process 25052 1726882491.89410: generating all_blocks data 25052 1726882491.89412: done generating all_blocks data 25052 1726882491.89414: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882491.89415: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882491.89417: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 25052 1726882491.89732: done processing included file 25052 1726882491.89735: iterating over new_blocks loaded from include file 25052 1726882491.89736: in VariableManager get_vars() 25052 1726882491.89796: done with get_vars() 25052 1726882491.89798: filtering new block on tags 25052 1726882491.89870: done filtering new block on tags 25052 1726882491.89873: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 25052 1726882491.89878: extending task lists for all hosts with included blocks 25052 1726882491.90234: done extending task lists 25052 1726882491.90236: done processing included files 25052 1726882491.90236: results queue empty 25052 1726882491.90237: checking for any_errors_fatal 25052 1726882491.90240: done checking for any_errors_fatal 25052 1726882491.90241: checking for max_fail_percentage 25052 1726882491.90242: done checking for max_fail_percentage 25052 1726882491.90243: checking to see if all hosts have failed and the running result is not ok 25052 1726882491.90244: done checking to see if all hosts have failed 25052 1726882491.90244: getting the remaining hosts for this loop 25052 1726882491.90246: done getting the remaining hosts for this loop 25052 1726882491.90364: getting the next task for host managed_node2 25052 1726882491.90370: done getting next task for host managed_node2 25052 1726882491.90373: ^ task is: TASK: Gather current interface info 25052 1726882491.90377: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882491.90379: getting variables 25052 1726882491.90380: in VariableManager get_vars() 25052 1726882491.90396: Calling all_inventory to load vars for managed_node2 25052 1726882491.90399: Calling groups_inventory to load vars for managed_node2 25052 1726882491.90401: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882491.90406: Calling all_plugins_play to load vars for managed_node2 25052 1726882491.90409: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882491.90411: Calling groups_plugins_play to load vars for managed_node2 25052 1726882491.93222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882491.95392: done with get_vars() 25052 1726882491.95426: done getting variables 25052 1726882491.95473: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:34:51 -0400 (0:00:00.162) 0:00:28.909 ****** 25052 1726882491.95509: entering _queue_task() for managed_node2/command 25052 1726882491.95931: worker is 1 (out of 1 available) 25052 1726882491.95944: exiting _queue_task() for managed_node2/command 25052 1726882491.96297: done queuing things up, now waiting for results queue to drain 25052 1726882491.96299: waiting for pending results... 
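The next task, "Gather current interface info" from get_current_interfaces.yml:3, uses the command action; the module invocation dumped further down shows ls -1 executed with chdir /sys/class/net. A minimal sketch of a task that would produce that invocation (the register variable name is a guess and does not appear in the log):

  - name: Gather current interface info
    command: ls -1
    args:
      chdir: /sys/class/net
    register: _current_interfaces   # hypothetical name; the log only shows the command and chdir
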
25052 1726882491.96808: running TaskExecutor() for managed_node2/TASK: Gather current interface info 25052 1726882491.96902: in run() - task 12673a56-9f93-f7f6-4a6d-00000000071b 25052 1726882491.96908: variable 'ansible_search_path' from source: unknown 25052 1726882491.96912: variable 'ansible_search_path' from source: unknown 25052 1726882491.96940: calling self._execute() 25052 1726882491.97201: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.97204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.97207: variable 'omit' from source: magic vars 25052 1726882491.97838: variable 'ansible_distribution_major_version' from source: facts 25052 1726882491.97851: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882491.97857: variable 'omit' from source: magic vars 25052 1726882491.97907: variable 'omit' from source: magic vars 25052 1726882491.98001: variable 'omit' from source: magic vars 25052 1726882491.98004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882491.98135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882491.98138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882491.98140: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882491.98143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882491.98152: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882491.98159: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.98196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.98317: Set connection var ansible_pipelining to False 25052 1726882491.98321: Set connection var ansible_connection to ssh 25052 1726882491.98323: Set connection var ansible_shell_type to sh 25052 1726882491.98330: Set connection var ansible_timeout to 10 25052 1726882491.98338: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882491.98343: Set connection var ansible_shell_executable to /bin/sh 25052 1726882491.98485: variable 'ansible_shell_executable' from source: unknown 25052 1726882491.98489: variable 'ansible_connection' from source: unknown 25052 1726882491.98496: variable 'ansible_module_compression' from source: unknown 25052 1726882491.98499: variable 'ansible_shell_type' from source: unknown 25052 1726882491.98570: variable 'ansible_shell_executable' from source: unknown 25052 1726882491.98573: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882491.98576: variable 'ansible_pipelining' from source: unknown 25052 1726882491.98578: variable 'ansible_timeout' from source: unknown 25052 1726882491.98581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882491.98771: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882491.98784: variable 'omit' from source: magic vars 25052 
1726882491.98788: starting attempt loop 25052 1726882491.98794: running the handler 25052 1726882491.98920: _low_level_execute_command(): starting 25052 1726882491.98928: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882491.99835: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882491.99964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882491.99969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882491.99971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.00004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.00109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.02022: stdout chunk (state=3): >>>/root <<< 25052 1726882492.02026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.02180: stderr chunk (state=3): >>><<< 25052 1726882492.02183: stdout chunk (state=3): >>><<< 25052 1726882492.02215: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.02219: _low_level_execute_command(): starting 25052 1726882492.02222: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746 `" && echo 
ansible-tmp-1726882492.0220008-26413-13893349833746="` echo /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746 `" ) && sleep 0' 25052 1726882492.04028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.04244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.04375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.06219: stdout chunk (state=3): >>>ansible-tmp-1726882492.0220008-26413-13893349833746=/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746 <<< 25052 1726882492.06317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.06357: stderr chunk (state=3): >>><<< 25052 1726882492.06362: stdout chunk (state=3): >>><<< 25052 1726882492.06380: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882492.0220008-26413-13893349833746=/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.06417: variable 'ansible_module_compression' from source: unknown 25052 1726882492.06472: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882492.06512: variable 'ansible_facts' from source: unknown 25052 1726882492.06590: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py 25052 1726882492.06725: Sending initial data 25052 1726882492.06728: Sent initial data (155 bytes) 25052 1726882492.07252: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.07256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882492.07258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882492.07260: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882492.07263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.07312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882492.07316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.07388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.08913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882492.08968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882492.09026: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpkchmkhbv /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py <<< 25052 1726882492.09034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py" <<< 25052 1726882492.09084: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpkchmkhbv" to remote "/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py" <<< 25052 1726882492.10002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.10020: stderr chunk (state=3): >>><<< 25052 1726882492.10025: stdout chunk (state=3): >>><<< 25052 1726882492.10080: done transferring module to remote 25052 1726882492.10084: _low_level_execute_command(): starting 25052 1726882492.10086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/ /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py && sleep 0' 25052 1726882492.10984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882492.11004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882492.11022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.11055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882492.11084: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882492.11199: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.11214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.11317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.13050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.13058: stdout chunk (state=3): >>><<< 25052 1726882492.13060: stderr chunk (state=3): >>><<< 25052 1726882492.13074: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.13078: _low_level_execute_command(): starting 25052 1726882492.13084: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/AnsiballZ_command.py && sleep 0' 25052 1726882492.13549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.13552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882492.13555: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.13557: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.13559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.13623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.13625: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.13689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.28912: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:52.284965", "end": "2024-09-20 21:34:52.288103", "delta": "0:00:00.003138", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882492.30336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882492.30356: stderr chunk (state=3): >>><<< 25052 1726882492.30360: stdout chunk (state=3): >>><<< 25052 1726882492.30377: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nveth0", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:34:52.284965", "end": "2024-09-20 21:34:52.288103", "delta": "0:00:00.003138", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
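The exchange above is the ansible.legacy.command module running 'ls -1' with chdir=/sys/class/net on the managed host. A minimal task that would produce this invocation looks roughly like the sketch below; the register name is taken from the '_current_interfaces' variable referenced later in this log, and changed_when: false is inferred from the task result later reporting changed: false even though the raw module output reports changed: true.

    - name: Gather current interface info
      command: ls -1          # lists the entries of /sys/class/net, one interface per line
      args:
        chdir: /sys/class/net
      register: _current_interfaces   # assumed name, based on the variable read by the next task
      changed_when: false             # inferred from the 'changed: false' in the reported result
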
25052 1726882492.30427: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882492.30431: _low_level_execute_command(): starting 25052 1726882492.30435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882492.0220008-26413-13893349833746/ > /dev/null 2>&1 && sleep 0' 25052 1726882492.30922: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.30925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.30927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.30930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.30983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882492.30997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.31053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.33008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.33011: stdout chunk (state=3): >>><<< 25052 1726882492.33014: stderr chunk (state=3): >>><<< 25052 1726882492.33016: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.33018: handler run complete 25052 1726882492.33020: Evaluated conditional (False): False 25052 1726882492.33022: attempt loop complete, returning result 25052 1726882492.33024: _execute() done 25052 1726882492.33026: dumping result to json 25052 1726882492.33028: done dumping result, returning 25052 1726882492.33030: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-f7f6-4a6d-00000000071b] 25052 1726882492.33032: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000071b 25052 1726882492.33127: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000071b 25052 1726882492.33131: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003138", "end": "2024-09-20 21:34:52.288103", "rc": 0, "start": "2024-09-20 21:34:52.284965" } STDOUT: bonding_masters eth0 lo veth0 25052 1726882492.33236: no more pending results, returning what we have 25052 1726882492.33240: results queue empty 25052 1726882492.33241: checking for any_errors_fatal 25052 1726882492.33242: done checking for any_errors_fatal 25052 1726882492.33243: checking for max_fail_percentage 25052 1726882492.33244: done checking for max_fail_percentage 25052 1726882492.33245: checking to see if all hosts have failed and the running result is not ok 25052 1726882492.33246: done checking to see if all hosts have failed 25052 1726882492.33247: getting the remaining hosts for this loop 25052 1726882492.33248: done getting the remaining hosts for this loop 25052 1726882492.33252: getting the next task for host managed_node2 25052 1726882492.33259: done getting next task for host managed_node2 25052 1726882492.33261: ^ task is: TASK: Set current_interfaces 25052 1726882492.33267: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882492.33271: getting variables 25052 1726882492.33273: in VariableManager get_vars() 25052 1726882492.33335: Calling all_inventory to load vars for managed_node2 25052 1726882492.33339: Calling groups_inventory to load vars for managed_node2 25052 1726882492.33341: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882492.33352: Calling all_plugins_play to load vars for managed_node2 25052 1726882492.33354: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882492.33357: Calling groups_plugins_play to load vars for managed_node2 25052 1726882492.34287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882492.35752: done with get_vars() 25052 1726882492.35776: done getting variables 25052 1726882492.35841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:34:52 -0400 (0:00:00.403) 0:00:29.313 ****** 25052 1726882492.35873: entering _queue_task() for managed_node2/set_fact 25052 1726882492.36243: worker is 1 (out of 1 available) 25052 1726882492.36257: exiting _queue_task() for managed_node2/set_fact 25052 1726882492.36272: done queuing things up, now waiting for results queue to drain 25052 1726882492.36274: waiting for pending results... 
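TASK [Set current_interfaces] above points at get_current_interfaces.yml:9. Based on the fact it produces (current_interfaces == ['bonding_masters', 'eth0', 'lo', 'veth0']) and the '_current_interfaces' variable it reads, it is most likely a set_fact task along the lines of the sketch below; the exact Jinja2 expression (stdout_lines) is an assumption, only the resulting list is confirmed by the log.

    - name: Set current_interfaces
      set_fact:
        # assumed expression: turn the registered 'ls -1' output into a list of interface names
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
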
25052 1726882492.36555: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 25052 1726882492.36719: in run() - task 12673a56-9f93-f7f6-4a6d-00000000071c 25052 1726882492.36723: variable 'ansible_search_path' from source: unknown 25052 1726882492.36726: variable 'ansible_search_path' from source: unknown 25052 1726882492.36768: calling self._execute() 25052 1726882492.36872: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.36883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.37099: variable 'omit' from source: magic vars 25052 1726882492.37300: variable 'ansible_distribution_major_version' from source: facts 25052 1726882492.37314: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882492.37319: variable 'omit' from source: magic vars 25052 1726882492.37361: variable 'omit' from source: magic vars 25052 1726882492.37441: variable '_current_interfaces' from source: set_fact 25052 1726882492.37489: variable 'omit' from source: magic vars 25052 1726882492.37527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882492.37553: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882492.37588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882492.37612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882492.37616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882492.37646: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882492.37649: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.37652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.37741: Set connection var ansible_pipelining to False 25052 1726882492.37745: Set connection var ansible_connection to ssh 25052 1726882492.37748: Set connection var ansible_shell_type to sh 25052 1726882492.37753: Set connection var ansible_timeout to 10 25052 1726882492.37769: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882492.37772: Set connection var ansible_shell_executable to /bin/sh 25052 1726882492.37800: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.37804: variable 'ansible_connection' from source: unknown 25052 1726882492.37807: variable 'ansible_module_compression' from source: unknown 25052 1726882492.37809: variable 'ansible_shell_type' from source: unknown 25052 1726882492.37830: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.37832: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.37839: variable 'ansible_pipelining' from source: unknown 25052 1726882492.37842: variable 'ansible_timeout' from source: unknown 25052 1726882492.37844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.37984: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 25052 1726882492.37988: variable 'omit' from source: magic vars 25052 1726882492.37991: starting attempt loop 25052 1726882492.37994: running the handler 25052 1726882492.38005: handler run complete 25052 1726882492.38021: attempt loop complete, returning result 25052 1726882492.38024: _execute() done 25052 1726882492.38026: dumping result to json 25052 1726882492.38028: done dumping result, returning 25052 1726882492.38050: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-f7f6-4a6d-00000000071c] 25052 1726882492.38053: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000071c 25052 1726882492.38155: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000071c 25052 1726882492.38158: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "veth0" ] }, "changed": false } 25052 1726882492.38225: no more pending results, returning what we have 25052 1726882492.38228: results queue empty 25052 1726882492.38229: checking for any_errors_fatal 25052 1726882492.38238: done checking for any_errors_fatal 25052 1726882492.38238: checking for max_fail_percentage 25052 1726882492.38240: done checking for max_fail_percentage 25052 1726882492.38241: checking to see if all hosts have failed and the running result is not ok 25052 1726882492.38242: done checking to see if all hosts have failed 25052 1726882492.38243: getting the remaining hosts for this loop 25052 1726882492.38244: done getting the remaining hosts for this loop 25052 1726882492.38247: getting the next task for host managed_node2 25052 1726882492.38256: done getting next task for host managed_node2 25052 1726882492.38258: ^ task is: TASK: Show current_interfaces 25052 1726882492.38262: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882492.38266: getting variables 25052 1726882492.38267: in VariableManager get_vars() 25052 1726882492.38310: Calling all_inventory to load vars for managed_node2 25052 1726882492.38313: Calling groups_inventory to load vars for managed_node2 25052 1726882492.38315: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882492.38324: Calling all_plugins_play to load vars for managed_node2 25052 1726882492.38326: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882492.38328: Calling groups_plugins_play to load vars for managed_node2 25052 1726882492.39674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882492.40849: done with get_vars() 25052 1726882492.40868: done getting variables 25052 1726882492.40915: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:34:52 -0400 (0:00:00.050) 0:00:29.364 ****** 25052 1726882492.40939: entering _queue_task() for managed_node2/debug 25052 1726882492.41198: worker is 1 (out of 1 available) 25052 1726882492.41212: exiting _queue_task() for managed_node2/debug 25052 1726882492.41224: done queuing things up, now waiting for results queue to drain 25052 1726882492.41225: waiting for pending results... 25052 1726882492.41406: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 25052 1726882492.41485: in run() - task 12673a56-9f93-f7f6-4a6d-0000000006e5 25052 1726882492.41499: variable 'ansible_search_path' from source: unknown 25052 1726882492.41503: variable 'ansible_search_path' from source: unknown 25052 1726882492.41536: calling self._execute() 25052 1726882492.41611: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.41616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.41624: variable 'omit' from source: magic vars 25052 1726882492.41905: variable 'ansible_distribution_major_version' from source: facts 25052 1726882492.41915: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882492.41921: variable 'omit' from source: magic vars 25052 1726882492.41948: variable 'omit' from source: magic vars 25052 1726882492.42020: variable 'current_interfaces' from source: set_fact 25052 1726882492.42041: variable 'omit' from source: magic vars 25052 1726882492.42071: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882492.42101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882492.42120: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882492.42134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882492.42144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882492.42167: 
variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882492.42170: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.42173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.42248: Set connection var ansible_pipelining to False 25052 1726882492.42251: Set connection var ansible_connection to ssh 25052 1726882492.42253: Set connection var ansible_shell_type to sh 25052 1726882492.42259: Set connection var ansible_timeout to 10 25052 1726882492.42265: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882492.42270: Set connection var ansible_shell_executable to /bin/sh 25052 1726882492.42304: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.42307: variable 'ansible_connection' from source: unknown 25052 1726882492.42310: variable 'ansible_module_compression' from source: unknown 25052 1726882492.42312: variable 'ansible_shell_type' from source: unknown 25052 1726882492.42315: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.42317: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.42319: variable 'ansible_pipelining' from source: unknown 25052 1726882492.42323: variable 'ansible_timeout' from source: unknown 25052 1726882492.42325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.42437: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882492.42449: variable 'omit' from source: magic vars 25052 1726882492.42452: starting attempt loop 25052 1726882492.42455: running the handler 25052 1726882492.42497: handler run complete 25052 1726882492.42537: attempt loop complete, returning result 25052 1726882492.42540: _execute() done 25052 1726882492.42543: dumping result to json 25052 1726882492.42544: done dumping result, returning 25052 1726882492.42547: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-f7f6-4a6d-0000000006e5] 25052 1726882492.42549: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000006e5 25052 1726882492.42630: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000006e5 25052 1726882492.42633: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'veth0'] 25052 1726882492.42705: no more pending results, returning what we have 25052 1726882492.42708: results queue empty 25052 1726882492.42709: checking for any_errors_fatal 25052 1726882492.42716: done checking for any_errors_fatal 25052 1726882492.42716: checking for max_fail_percentage 25052 1726882492.42718: done checking for max_fail_percentage 25052 1726882492.42719: checking to see if all hosts have failed and the running result is not ok 25052 1726882492.42720: done checking to see if all hosts have failed 25052 1726882492.42720: getting the remaining hosts for this loop 25052 1726882492.42722: done getting the remaining hosts for this loop 25052 1726882492.42725: getting the next task for host managed_node2 25052 1726882492.42732: done getting next task for host managed_node2 25052 1726882492.42735: ^ task is: TASK: Install iproute 25052 1726882492.42738: ^ state is: HOST STATE: block=3, task=15, 
rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882492.42743: getting variables 25052 1726882492.42744: in VariableManager get_vars() 25052 1726882492.42778: Calling all_inventory to load vars for managed_node2 25052 1726882492.42781: Calling groups_inventory to load vars for managed_node2 25052 1726882492.42783: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882492.42796: Calling all_plugins_play to load vars for managed_node2 25052 1726882492.42798: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882492.42801: Calling groups_plugins_play to load vars for managed_node2 25052 1726882492.43743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882492.44739: done with get_vars() 25052 1726882492.44773: done getting variables 25052 1726882492.44859: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:34:52 -0400 (0:00:00.039) 0:00:29.403 ****** 25052 1726882492.44909: entering _queue_task() for managed_node2/package 25052 1726882492.45319: worker is 1 (out of 1 available) 25052 1726882492.45336: exiting _queue_task() for managed_node2/package 25052 1726882492.45346: done queuing things up, now waiting for results queue to drain 25052 1726882492.45351: waiting for pending results... 
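TASK [Install iproute] above resolves to the dnf module (the cached ansible.modules.dnf AnsiballZ payload is reused), invoked with name=['iproute'] and state=present as shown in the module_args further down. A sketch of such a task follows; the register/until handling is inferred from the '__install_status is success' conditional and the '"attempts": 1' field in the result, and the retry count is illustrative only.

    - name: Install iproute
      package:
        name: iproute
        state: present
      register: __install_status
      until: __install_status is success
      retries: 6    # illustrative; the actual retry count is not visible in this log
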
25052 1726882492.45820: running TaskExecutor() for managed_node2/TASK: Install iproute 25052 1726882492.45825: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005cf 25052 1726882492.45830: variable 'ansible_search_path' from source: unknown 25052 1726882492.45865: variable 'ansible_search_path' from source: unknown 25052 1726882492.45888: calling self._execute() 25052 1726882492.46034: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.46039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.46047: variable 'omit' from source: magic vars 25052 1726882492.46400: variable 'ansible_distribution_major_version' from source: facts 25052 1726882492.46406: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882492.46419: variable 'omit' from source: magic vars 25052 1726882492.46443: variable 'omit' from source: magic vars 25052 1726882492.46674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 25052 1726882492.48884: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 25052 1726882492.49011: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 25052 1726882492.49014: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 25052 1726882492.49188: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 25052 1726882492.49224: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 25052 1726882492.49335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 25052 1726882492.49415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 25052 1726882492.49447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 25052 1726882492.49520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 25052 1726882492.49547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 25052 1726882492.49631: variable '__network_is_ostree' from source: set_fact 25052 1726882492.49635: variable 'omit' from source: magic vars 25052 1726882492.49658: variable 'omit' from source: magic vars 25052 1726882492.49681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882492.49707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882492.49722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882492.49735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 25052 1726882492.49744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882492.49767: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882492.49770: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.49774: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.49855: Set connection var ansible_pipelining to False 25052 1726882492.49858: Set connection var ansible_connection to ssh 25052 1726882492.49861: Set connection var ansible_shell_type to sh 25052 1726882492.49866: Set connection var ansible_timeout to 10 25052 1726882492.49873: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882492.49877: Set connection var ansible_shell_executable to /bin/sh 25052 1726882492.49898: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.49901: variable 'ansible_connection' from source: unknown 25052 1726882492.49904: variable 'ansible_module_compression' from source: unknown 25052 1726882492.49906: variable 'ansible_shell_type' from source: unknown 25052 1726882492.49910: variable 'ansible_shell_executable' from source: unknown 25052 1726882492.49912: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882492.49915: variable 'ansible_pipelining' from source: unknown 25052 1726882492.49917: variable 'ansible_timeout' from source: unknown 25052 1726882492.49926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882492.49989: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882492.50002: variable 'omit' from source: magic vars 25052 1726882492.50006: starting attempt loop 25052 1726882492.50008: running the handler 25052 1726882492.50016: variable 'ansible_facts' from source: unknown 25052 1726882492.50020: variable 'ansible_facts' from source: unknown 25052 1726882492.50049: _low_level_execute_command(): starting 25052 1726882492.50057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882492.50536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.50540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.50543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.50545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.50590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882492.50599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.50602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.50685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.52265: stdout chunk (state=3): >>>/root <<< 25052 1726882492.52426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.52429: stdout chunk (state=3): >>><<< 25052 1726882492.52431: stderr chunk (state=3): >>><<< 25052 1726882492.52547: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.52556: _low_level_execute_command(): starting 25052 1726882492.52558: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384 `" && echo ansible-tmp-1726882492.524698-26438-261813239766384="` echo /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384 `" ) && sleep 0' 25052 1726882492.53357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882492.53361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882492.53363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882492.53365: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.53367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.53417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.53434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.53527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.55433: stdout chunk (state=3): >>>ansible-tmp-1726882492.524698-26438-261813239766384=/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384 <<< 25052 1726882492.55570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.55574: stdout chunk (state=3): >>><<< 25052 1726882492.55577: stderr chunk (state=3): >>><<< 25052 1726882492.55800: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882492.524698-26438-261813239766384=/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.55803: variable 'ansible_module_compression' from source: unknown 25052 1726882492.55806: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 25052 1726882492.55808: variable 'ansible_facts' from source: unknown 25052 1726882492.55881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py 25052 1726882492.56049: Sending initial data 25052 1726882492.56052: Sent initial data (151 bytes) 25052 1726882492.56492: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882492.56507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.56519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.56562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882492.56587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.56648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.58195: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882492.58271: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882492.58339: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp65ae5mq3 /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py <<< 25052 1726882492.58343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py" <<< 25052 1726882492.58457: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp65ae5mq3" to remote "/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py" <<< 25052 1726882492.59585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.59692: stderr chunk (state=3): >>><<< 25052 1726882492.59698: stdout chunk (state=3): >>><<< 25052 1726882492.59727: done transferring module to remote 25052 1726882492.59749: _low_level_execute_command(): starting 25052 1726882492.59778: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/ /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py && sleep 0' 25052 1726882492.60191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.60199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.60202: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.60204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.60255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.60258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.60323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882492.62174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882492.62178: stdout chunk (state=3): >>><<< 25052 1726882492.62180: stderr chunk (state=3): >>><<< 25052 1726882492.62182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882492.62185: _low_level_execute_command(): starting 25052 1726882492.62187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/AnsiballZ_dnf.py && sleep 0' 25052 1726882492.62759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882492.62773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882492.62788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882492.62813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882492.62868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882492.62939: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882492.62975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882492.62997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882492.63197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.03137: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 25052 1726882493.07132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882493.07146: stdout chunk (state=3): >>><<< 25052 1726882493.07161: stderr chunk (state=3): >>><<< 25052 1726882493.07185: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882493.07300: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882493.07308: _low_level_execute_command(): starting 25052 1726882493.07311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882492.524698-26438-261813239766384/ > /dev/null 2>&1 && sleep 0' 25052 1726882493.07984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882493.07998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 25052 1726882493.08082: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.08102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.08134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882493.08138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.08219: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.10142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.10146: stderr chunk (state=3): >>><<< 25052 1726882493.10148: stdout chunk (state=3): >>><<< 25052 1726882493.10199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882493.10203: handler run complete 25052 1726882493.10477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 25052 1726882493.10766: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 25052 1726882493.10806: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 25052 1726882493.10838: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 25052 1726882493.10967: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 25052 1726882493.11125: variable '__install_status' from source: set_fact 25052 1726882493.11169: Evaluated conditional (__install_status is success): True 25052 1726882493.11172: attempt loop complete, returning result 25052 1726882493.11175: _execute() done 25052 1726882493.11177: dumping result to json 25052 1726882493.11192: done dumping result, returning 25052 1726882493.11231: done running TaskExecutor() for managed_node2/TASK: Install iproute [12673a56-9f93-f7f6-4a6d-0000000005cf] 25052 1726882493.11234: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cf ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 25052 1726882493.11432: no more pending results, returning what we have 25052 1726882493.11436: results queue empty 25052 1726882493.11437: checking for any_errors_fatal 25052 1726882493.11444: done checking for any_errors_fatal 25052 1726882493.11445: checking for max_fail_percentage 25052 1726882493.11447: done checking for max_fail_percentage 25052 1726882493.11448: checking to see if all hosts have failed and the running result is not ok 25052 1726882493.11449: done checking to see if all hosts have failed 25052 1726882493.11449: getting the remaining hosts for this loop 25052 1726882493.11451: done getting the remaining hosts for this loop 25052 1726882493.11453: getting the next task for host managed_node2 25052 1726882493.11460: done getting next task for host managed_node2 25052 1726882493.11463: ^ task is: TASK: Create veth interface {{ interface }} 25052 1726882493.11465: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 25052 1726882493.11469: getting variables 25052 1726882493.11471: in VariableManager get_vars() 25052 1726882493.11521: Calling all_inventory to load vars for managed_node2 25052 1726882493.11525: Calling groups_inventory to load vars for managed_node2 25052 1726882493.11528: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882493.11541: Calling all_plugins_play to load vars for managed_node2 25052 1726882493.11544: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882493.11548: Calling groups_plugins_play to load vars for managed_node2 25052 1726882493.12105: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005cf 25052 1726882493.12109: WORKER PROCESS EXITING 25052 1726882493.12633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882493.14246: done with get_vars() 25052 1726882493.14274: done getting variables 25052 1726882493.14498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882493.14770: variable 'interface' from source: play vars TASK [Create veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:34:53 -0400 (0:00:00.698) 0:00:30.102 ****** 25052 1726882493.14804: entering _queue_task() for managed_node2/command 25052 1726882493.15153: worker is 1 (out of 1 available) 25052 1726882493.15167: exiting _queue_task() for managed_node2/command 25052 1726882493.15179: done queuing things up, now waiting for results queue to drain 25052 1726882493.15180: waiting for pending results... 
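[editor's note] The "Install iproute" result above (ansible.legacy.dnf invoked with name=iproute, state=present, "attempts": 1, and the conditional "__install_status is success") is consistent with a retrying package task roughly like the sketch below. This is a reconstruction, not the file's contents: the module spelling, retries, and delay are not recorded in this log and are placeholders.

    - name: Install iproute
      ansible.builtin.dnf:        # logged as ansible.legacy.dnf; the task file may spell this as dnf or package
        name: iproute
        state: present
      register: __install_status  # the log resolves __install_status from set_fact, i.e. a registered result
      until: __install_status is success
      retries: 3                  # not recorded in this log; placeholder
      delay: 5                    # not recorded in this log; placeholder

Because the package was already installed, dnf reported "Nothing to do" and the task came back ok with changed=false on the first attempt.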
25052 1726882493.15416: running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 25052 1726882493.15531: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d0 25052 1726882493.15555: variable 'ansible_search_path' from source: unknown 25052 1726882493.15562: variable 'ansible_search_path' from source: unknown 25052 1726882493.15870: variable 'interface' from source: play vars 25052 1726882493.15965: variable 'interface' from source: play vars 25052 1726882493.16057: variable 'interface' from source: play vars 25052 1726882493.16236: Loaded config def from plugin (lookup/items) 25052 1726882493.16248: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 25052 1726882493.16285: variable 'omit' from source: magic vars 25052 1726882493.16543: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.16606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.16646: variable 'omit' from source: magic vars 25052 1726882493.17069: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.17073: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.17269: variable 'type' from source: play vars 25052 1726882493.17285: variable 'state' from source: include params 25052 1726882493.17288: variable 'interface' from source: play vars 25052 1726882493.17295: variable 'current_interfaces' from source: set_fact 25052 1726882493.17326: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25052 1726882493.17331: when evaluation is False, skipping this task 25052 1726882493.17334: variable 'item' from source: unknown 25052 1726882493.17399: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link add veth0 type veth peer name peerveth0) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link add veth0 type veth peer name peerveth0", "skip_reason": "Conditional result was False" } 25052 1726882493.17888: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.17900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.17903: variable 'omit' from source: magic vars 25052 1726882493.17906: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.17915: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.17918: variable 'type' from source: play vars 25052 1726882493.17920: variable 'state' from source: include params 25052 1726882493.17922: variable 'interface' from source: play vars 25052 1726882493.17924: variable 'current_interfaces' from source: set_fact 25052 1726882493.17926: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25052 1726882493.17929: when evaluation is False, skipping this task 25052 1726882493.17930: variable 'item' from source: unknown 25052 1726882493.17932: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set peerveth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set peerveth0 up", "skip_reason": "Conditional result was False" } 25052 1726882493.18202: variable 
'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.18205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.18207: variable 'omit' from source: magic vars 25052 1726882493.18210: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.18212: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.18401: variable 'type' from source: play vars 25052 1726882493.18404: variable 'state' from source: include params 25052 1726882493.18406: variable 'interface' from source: play vars 25052 1726882493.18409: variable 'current_interfaces' from source: set_fact 25052 1726882493.18413: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): False 25052 1726882493.18415: when evaluation is False, skipping this task 25052 1726882493.18417: variable 'item' from source: unknown 25052 1726882493.18420: variable 'item' from source: unknown skipping: [managed_node2] => (item=ip link set veth0 up) => { "ansible_loop_var": "item", "changed": false, "false_condition": "type == 'veth' and state == 'present' and interface not in current_interfaces", "item": "ip link set veth0 up", "skip_reason": "Conditional result was False" } 25052 1726882493.18551: dumping result to json 25052 1726882493.18554: done dumping result, returning 25052 1726882493.18556: done running TaskExecutor() for managed_node2/TASK: Create veth interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d0] 25052 1726882493.18558: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d0 25052 1726882493.18596: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d0 25052 1726882493.18599: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false } MSG: All items skipped 25052 1726882493.18687: no more pending results, returning what we have 25052 1726882493.18690: results queue empty 25052 1726882493.18695: checking for any_errors_fatal 25052 1726882493.18701: done checking for any_errors_fatal 25052 1726882493.18701: checking for max_fail_percentage 25052 1726882493.18703: done checking for max_fail_percentage 25052 1726882493.18703: checking to see if all hosts have failed and the running result is not ok 25052 1726882493.18704: done checking to see if all hosts have failed 25052 1726882493.18705: getting the remaining hosts for this loop 25052 1726882493.18706: done getting the remaining hosts for this loop 25052 1726882493.18708: getting the next task for host managed_node2 25052 1726882493.18713: done getting next task for host managed_node2 25052 1726882493.18715: ^ task is: TASK: Set up veth as managed by NetworkManager 25052 1726882493.18718: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882493.18722: getting variables 25052 1726882493.18724: in VariableManager get_vars() 25052 1726882493.18758: Calling all_inventory to load vars for managed_node2 25052 1726882493.18761: Calling groups_inventory to load vars for managed_node2 25052 1726882493.18763: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882493.18771: Calling all_plugins_play to load vars for managed_node2 25052 1726882493.18774: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882493.18776: Calling groups_plugins_play to load vars for managed_node2 25052 1726882493.20220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882493.21922: done with get_vars() 25052 1726882493.21949: done getting variables 25052 1726882493.22010: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:34:53 -0400 (0:00:00.072) 0:00:30.175 ****** 25052 1726882493.22059: entering _queue_task() for managed_node2/command 25052 1726882493.22648: worker is 1 (out of 1 available) 25052 1726882493.22658: exiting _queue_task() for managed_node2/command 25052 1726882493.22669: done queuing things up, now waiting for results queue to drain 25052 1726882493.22670: waiting for pending results... 
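[editor's note] The three skipped loop items and the false_condition recorded above pin down the shape of the task at manage_test_interface.yml:27 fairly closely. A sketch, assuming a loop keyword and reconstructing the peer name pattern peer{{ interface }} from the literal "peerveth0" in the logged items; everything else is taken from the log:

    - name: Create veth interface {{ interface }}
      ansible.builtin.command: "{{ item }}"
      loop:
        - "ip link add {{ interface }} type veth peer name peer{{ interface }}"
        - "ip link set peer{{ interface }} up"
        - "ip link set {{ interface }} up"
      when: type == 'veth' and state == 'present' and interface not in current_interfaces

With state resolved to 'absent' in this run, the when expression is False for every item, so all three commands are skipped and the task reports "All items skipped".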
25052 1726882493.22860: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 25052 1726882493.22928: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d1 25052 1726882493.22932: variable 'ansible_search_path' from source: unknown 25052 1726882493.22935: variable 'ansible_search_path' from source: unknown 25052 1726882493.23034: calling self._execute() 25052 1726882493.23063: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.23070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.23082: variable 'omit' from source: magic vars 25052 1726882493.23450: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.23467: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.23625: variable 'type' from source: play vars 25052 1726882493.23629: variable 'state' from source: include params 25052 1726882493.23636: Evaluated conditional (type == 'veth' and state == 'present'): False 25052 1726882493.23639: when evaluation is False, skipping this task 25052 1726882493.23642: _execute() done 25052 1726882493.23644: dumping result to json 25052 1726882493.23646: done dumping result, returning 25052 1726882493.23654: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-f7f6-4a6d-0000000005d1] 25052 1726882493.23666: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d1 25052 1726882493.23755: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d1 25052 1726882493.23757: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'present'", "skip_reason": "Conditional result was False" } 25052 1726882493.23817: no more pending results, returning what we have 25052 1726882493.23822: results queue empty 25052 1726882493.23823: checking for any_errors_fatal 25052 1726882493.23838: done checking for any_errors_fatal 25052 1726882493.23839: checking for max_fail_percentage 25052 1726882493.23841: done checking for max_fail_percentage 25052 1726882493.23842: checking to see if all hosts have failed and the running result is not ok 25052 1726882493.23843: done checking to see if all hosts have failed 25052 1726882493.23844: getting the remaining hosts for this loop 25052 1726882493.23846: done getting the remaining hosts for this loop 25052 1726882493.23850: getting the next task for host managed_node2 25052 1726882493.23858: done getting next task for host managed_node2 25052 1726882493.23861: ^ task is: TASK: Delete veth interface {{ interface }} 25052 1726882493.23865: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882493.23870: getting variables 25052 1726882493.23872: in VariableManager get_vars() 25052 1726882493.23919: Calling all_inventory to load vars for managed_node2 25052 1726882493.23922: Calling groups_inventory to load vars for managed_node2 25052 1726882493.23924: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882493.23938: Calling all_plugins_play to load vars for managed_node2 25052 1726882493.23941: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882493.23944: Calling groups_plugins_play to load vars for managed_node2 25052 1726882493.25452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882493.27226: done with get_vars() 25052 1726882493.27327: done getting variables 25052 1726882493.27478: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882493.27596: variable 'interface' from source: play vars TASK [Delete veth interface veth0] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:34:53 -0400 (0:00:00.055) 0:00:30.231 ****** 25052 1726882493.27628: entering _queue_task() for managed_node2/command 25052 1726882493.27961: worker is 1 (out of 1 available) 25052 1726882493.27973: exiting _queue_task() for managed_node2/command 25052 1726882493.27986: done queuing things up, now waiting for results queue to drain 25052 1726882493.27987: waiting for pending results... 
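[editor's note] Throughout this block the log resolves 'type' and 'interface' from play vars and 'state' from include params, and only the veth/absent branch evaluates True. The include that drives this teardown therefore presumably looks something like the following sketch; only the variable names, their sources, and their values (type=veth, interface=veth0, state=absent) come from the log, while the task name and file layout are assumptions:

    vars:
      type: veth            # 'from source: play vars' in the log
      interface: veth0      # 'from source: play vars' in the log

    tasks:
      - name: Remove test interface                      # name is an assumption
        include_tasks: tasks/manage_test_interface.yml
        vars:
          state: absent                                  # 'from source: include params' in the log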
25052 1726882493.28411: running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 25052 1726882493.28422: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d2 25052 1726882493.28426: variable 'ansible_search_path' from source: unknown 25052 1726882493.28429: variable 'ansible_search_path' from source: unknown 25052 1726882493.28431: calling self._execute() 25052 1726882493.28523: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.28533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.28584: variable 'omit' from source: magic vars 25052 1726882493.28911: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.28915: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.29118: variable 'type' from source: play vars 25052 1726882493.29129: variable 'state' from source: include params 25052 1726882493.29136: variable 'interface' from source: play vars 25052 1726882493.29139: variable 'current_interfaces' from source: set_fact 25052 1726882493.29142: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): True 25052 1726882493.29145: variable 'omit' from source: magic vars 25052 1726882493.29198: variable 'omit' from source: magic vars 25052 1726882493.29347: variable 'interface' from source: play vars 25052 1726882493.29352: variable 'omit' from source: magic vars 25052 1726882493.29355: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882493.29368: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882493.29389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882493.29412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882493.29425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882493.29455: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882493.29459: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.29461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.29570: Set connection var ansible_pipelining to False 25052 1726882493.29574: Set connection var ansible_connection to ssh 25052 1726882493.29577: Set connection var ansible_shell_type to sh 25052 1726882493.29582: Set connection var ansible_timeout to 10 25052 1726882493.29590: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882493.29596: Set connection var ansible_shell_executable to /bin/sh 25052 1726882493.29619: variable 'ansible_shell_executable' from source: unknown 25052 1726882493.29622: variable 'ansible_connection' from source: unknown 25052 1726882493.29630: variable 'ansible_module_compression' from source: unknown 25052 1726882493.29632: variable 'ansible_shell_type' from source: unknown 25052 1726882493.29635: variable 'ansible_shell_executable' from source: unknown 25052 1726882493.29638: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.29674: variable 'ansible_pipelining' from source: unknown 25052 1726882493.29678: variable 'ansible_timeout' from source: unknown 25052 1726882493.29680: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.29786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882493.29799: variable 'omit' from source: magic vars 25052 1726882493.29802: starting attempt loop 25052 1726882493.29805: running the handler 25052 1726882493.29896: _low_level_execute_command(): starting 25052 1726882493.29900: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882493.30615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.30654: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.30668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882493.30687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.30775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.32418: stdout chunk (state=3): >>>/root <<< 25052 1726882493.32572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.32575: stdout chunk (state=3): >>><<< 25052 1726882493.32578: stderr chunk (state=3): >>><<< 25052 1726882493.32602: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882493.32621: _low_level_execute_command(): starting 25052 1726882493.32631: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120 `" && echo ansible-tmp-1726882493.3260965-26481-12063614242120="` echo /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120 `" ) && sleep 0' 25052 1726882493.33287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882493.33308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882493.33332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882493.33358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882493.33471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882493.33496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.33597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.35468: stdout chunk (state=3): >>>ansible-tmp-1726882493.3260965-26481-12063614242120=/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120 <<< 25052 1726882493.35707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.35711: stdout chunk (state=3): >>><<< 25052 1726882493.35714: stderr chunk (state=3): >>><<< 25052 1726882493.35716: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882493.3260965-26481-12063614242120=/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882493.35719: variable 'ansible_module_compression' from source: unknown 25052 1726882493.35745: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882493.35786: variable 'ansible_facts' from source: unknown 25052 1726882493.35887: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py 25052 1726882493.36048: Sending initial data 25052 1726882493.36171: Sent initial data (155 bytes) 25052 1726882493.36808: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.36863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.36877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882493.36906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.37070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.38610: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882493.38699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 25052 1726882493.38751: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpmt9nsaws /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py <<< 25052 1726882493.38754: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py" <<< 25052 1726882493.38839: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpmt9nsaws" to remote "/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py" <<< 25052 1726882493.40475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.40628: stderr chunk (state=3): >>><<< 25052 1726882493.40631: stdout chunk (state=3): >>><<< 25052 1726882493.40634: done transferring module to remote 25052 1726882493.40636: _low_level_execute_command(): starting 25052 1726882493.40638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/ /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py && sleep 0' 25052 1726882493.41282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882493.41300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882493.41315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882493.41331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882493.41415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.41450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.41480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.41572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.43437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.43448: stdout chunk (state=3): >>><<< 25052 1726882493.43460: stderr chunk (state=3): >>><<< 25052 1726882493.43909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882493.43912: _low_level_execute_command(): starting 25052 1726882493.43916: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/AnsiballZ_command.py && sleep 0' 25052 1726882493.45318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.45341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882493.45359: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882493.45417: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.45489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.45588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.45691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.61660: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:34:53.604097", "end": "2024-09-20 21:34:53.612497", "delta": "0:00:00.008400", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882493.63716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882493.63777: stdout chunk (state=3): >>><<< 25052 1726882493.63790: stderr chunk (state=3): >>><<< 25052 1726882493.64106: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "veth0", "type", "veth"], "start": "2024-09-20 21:34:53.604097", "end": "2024-09-20 21:34:53.612497", "delta": "0:00:00.008400", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del veth0 type veth", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
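[editor's note] Combining the True conditional evaluated in the _execute() trace above with the argv in the module result ("ip", "link", "del", "veth0", "type", "veth"), the task at manage_test_interface.yml:43 is roughly the following. The final task result printed below reports changed: false even though the command module itself returned changed: true, which suggests the task overrides change reporting; that override is shown here as an assumption:

    - name: Delete veth interface {{ interface }}
      ansible.builtin.command: ip link del {{ interface }} type veth
      when: type == 'veth' and state == 'absent' and interface in current_interfaces
      changed_when: false    # assumption, inferred from the changed=false task result below

The command ran in about 8 ms on the target and removed veth0, which is why the interface no longer appears in later checks.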
25052 1726882493.64112: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del veth0 type veth', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882493.64115: _low_level_execute_command(): starting 25052 1726882493.64118: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882493.3260965-26481-12063614242120/ > /dev/null 2>&1 && sleep 0' 25052 1726882493.65535: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882493.65591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882493.65605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882493.65927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882493.66012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882493.68325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882493.68329: stderr chunk (state=3): >>><<< 25052 1726882493.68331: stdout chunk (state=3): >>><<< 25052 1726882493.68333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882493.68336: handler run complete 25052 1726882493.68338: Evaluated conditional (False): False 25052 1726882493.68340: attempt loop complete, returning result 25052 1726882493.68342: _execute() done 25052 1726882493.68344: dumping result to json 25052 1726882493.68345: done dumping result, returning 25052 1726882493.68474: done running TaskExecutor() for managed_node2/TASK: Delete veth interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d2] 25052 1726882493.68477: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d2 25052 1726882493.68798: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d2 25052 1726882493.68802: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "veth0", "type", "veth" ], "delta": "0:00:00.008400", "end": "2024-09-20 21:34:53.612497", "rc": 0, "start": "2024-09-20 21:34:53.604097" } 25052 1726882493.68953: no more pending results, returning what we have 25052 1726882493.68956: results queue empty 25052 1726882493.68957: checking for any_errors_fatal 25052 1726882493.68962: done checking for any_errors_fatal 25052 1726882493.68962: checking for max_fail_percentage 25052 1726882493.68964: done checking for max_fail_percentage 25052 1726882493.68965: checking to see if all hosts have failed and the running result is not ok 25052 1726882493.68966: done checking to see if all hosts have failed 25052 1726882493.68966: getting the remaining hosts for this loop 25052 1726882493.68967: done getting the remaining hosts for this loop 25052 1726882493.68971: getting the next task for host managed_node2 25052 1726882493.68977: done getting next task for host managed_node2 25052 1726882493.68980: ^ task is: TASK: Create dummy interface {{ interface }} 25052 1726882493.68983: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882493.68987: getting variables 25052 1726882493.68988: in VariableManager get_vars() 25052 1726882493.69033: Calling all_inventory to load vars for managed_node2 25052 1726882493.69037: Calling groups_inventory to load vars for managed_node2 25052 1726882493.69039: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882493.69048: Calling all_plugins_play to load vars for managed_node2 25052 1726882493.69050: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882493.69052: Calling groups_plugins_play to load vars for managed_node2 25052 1726882493.72215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882493.75927: done with get_vars() 25052 1726882493.76073: done getting variables 25052 1726882493.76142: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882493.76367: variable 'interface' from source: play vars TASK [Create dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:34:53 -0400 (0:00:00.488) 0:00:30.720 ****** 25052 1726882493.76527: entering _queue_task() for managed_node2/command 25052 1726882493.77268: worker is 1 (out of 1 available) 25052 1726882493.77281: exiting _queue_task() for managed_node2/command 25052 1726882493.77413: done queuing things up, now waiting for results queue to drain 25052 1726882493.77414: waiting for pending results... 
25052 1726882493.77808: running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 25052 1726882493.78049: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d3 25052 1726882493.78073: variable 'ansible_search_path' from source: unknown 25052 1726882493.78081: variable 'ansible_search_path' from source: unknown 25052 1726882493.78121: calling self._execute() 25052 1726882493.78252: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.78371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.78387: variable 'omit' from source: magic vars 25052 1726882493.79483: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.79777: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.79884: variable 'type' from source: play vars 25052 1726882493.80030: variable 'state' from source: include params 25052 1726882493.80068: variable 'interface' from source: play vars 25052 1726882493.80078: variable 'current_interfaces' from source: set_fact 25052 1726882493.80090: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 25052 1726882493.80135: when evaluation is False, skipping this task 25052 1726882493.80143: _execute() done 25052 1726882493.80150: dumping result to json 25052 1726882493.80159: done dumping result, returning 25052 1726882493.80174: done running TaskExecutor() for managed_node2/TASK: Create dummy interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d3] 25052 1726882493.80347: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d3 25052 1726882493.80418: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d3 25052 1726882493.80422: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882493.80478: no more pending results, returning what we have 25052 1726882493.80483: results queue empty 25052 1726882493.80484: checking for any_errors_fatal 25052 1726882493.80499: done checking for any_errors_fatal 25052 1726882493.80500: checking for max_fail_percentage 25052 1726882493.80502: done checking for max_fail_percentage 25052 1726882493.80503: checking to see if all hosts have failed and the running result is not ok 25052 1726882493.80504: done checking to see if all hosts have failed 25052 1726882493.80505: getting the remaining hosts for this loop 25052 1726882493.80506: done getting the remaining hosts for this loop 25052 1726882493.80510: getting the next task for host managed_node2 25052 1726882493.80517: done getting next task for host managed_node2 25052 1726882493.80521: ^ task is: TASK: Delete dummy interface {{ interface }} 25052 1726882493.80525: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882493.80530: getting variables 25052 1726882493.80532: in VariableManager get_vars() 25052 1726882493.80576: Calling all_inventory to load vars for managed_node2 25052 1726882493.80578: Calling groups_inventory to load vars for managed_node2 25052 1726882493.80580: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882493.80848: Calling all_plugins_play to load vars for managed_node2 25052 1726882493.80851: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882493.80854: Calling groups_plugins_play to load vars for managed_node2 25052 1726882493.83905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882493.97555: done with get_vars() 25052 1726882493.97583: done getting variables 25052 1726882493.97755: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882493.98086: variable 'interface' from source: play vars TASK [Delete dummy interface veth0] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:34:53 -0400 (0:00:00.215) 0:00:30.935 ****** 25052 1726882493.98113: entering _queue_task() for managed_node2/command 25052 1726882493.98645: worker is 1 (out of 1 available) 25052 1726882493.98656: exiting _queue_task() for managed_node2/command 25052 1726882493.98667: done queuing things up, now waiting for results queue to drain 25052 1726882493.98669: waiting for pending results... 
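[editor's note] The remaining tasks in manage_test_interface.yml repeat the same pattern for the dummy and tap interface types, each gated on type, state, and current_interfaces; since type is 'veth' in this run, the skip entries above and below show every one of them being bypassed. A sketch of one such branch, with the when expression taken from the log; the ip command itself is never recorded here because the task is skipped, so it is purely illustrative:

    - name: Create dummy interface {{ interface }}
      ansible.builtin.command: ip link add {{ interface }} type dummy   # illustrative only; not recorded in this log
      when: type == 'dummy' and state == 'present' and interface not in current_interfaces

The matching delete-dummy, create-tap, and delete-tap tasks that follow in the log differ only in their when expressions.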
25052 1726882493.98874: running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 25052 1726882493.99056: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d4 25052 1726882493.99060: variable 'ansible_search_path' from source: unknown 25052 1726882493.99063: variable 'ansible_search_path' from source: unknown 25052 1726882493.99065: calling self._execute() 25052 1726882493.99167: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882493.99178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882493.99195: variable 'omit' from source: magic vars 25052 1726882493.99601: variable 'ansible_distribution_major_version' from source: facts 25052 1726882493.99620: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882493.99919: variable 'type' from source: play vars 25052 1726882493.99923: variable 'state' from source: include params 25052 1726882493.99925: variable 'interface' from source: play vars 25052 1726882493.99928: variable 'current_interfaces' from source: set_fact 25052 1726882493.99931: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 25052 1726882493.99933: when evaluation is False, skipping this task 25052 1726882493.99935: _execute() done 25052 1726882493.99937: dumping result to json 25052 1726882493.99939: done dumping result, returning 25052 1726882493.99941: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d4] 25052 1726882493.99943: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d4 25052 1726882494.00008: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d4 25052 1726882494.00011: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882494.00070: no more pending results, returning what we have 25052 1726882494.00074: results queue empty 25052 1726882494.00075: checking for any_errors_fatal 25052 1726882494.00082: done checking for any_errors_fatal 25052 1726882494.00083: checking for max_fail_percentage 25052 1726882494.00085: done checking for max_fail_percentage 25052 1726882494.00086: checking to see if all hosts have failed and the running result is not ok 25052 1726882494.00086: done checking to see if all hosts have failed 25052 1726882494.00087: getting the remaining hosts for this loop 25052 1726882494.00088: done getting the remaining hosts for this loop 25052 1726882494.00092: getting the next task for host managed_node2 25052 1726882494.00109: done getting next task for host managed_node2 25052 1726882494.00112: ^ task is: TASK: Create tap interface {{ interface }} 25052 1726882494.00116: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882494.00121: getting variables 25052 1726882494.00123: in VariableManager get_vars() 25052 1726882494.00239: Calling all_inventory to load vars for managed_node2 25052 1726882494.00242: Calling groups_inventory to load vars for managed_node2 25052 1726882494.00245: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.00258: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.00261: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.00265: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.02074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.03949: done with get_vars() 25052 1726882494.03970: done getting variables 25052 1726882494.04036: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882494.04131: variable 'interface' from source: play vars TASK [Create tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:34:54 -0400 (0:00:00.060) 0:00:30.996 ****** 25052 1726882494.04155: entering _queue_task() for managed_node2/command 25052 1726882494.04553: worker is 1 (out of 1 available) 25052 1726882494.04566: exiting _queue_task() for managed_node2/command 25052 1726882494.04578: done queuing things up, now waiting for results queue to drain 25052 1726882494.04579: waiting for pending results... 
25052 1726882494.04840: running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 25052 1726882494.04997: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d5 25052 1726882494.05005: variable 'ansible_search_path' from source: unknown 25052 1726882494.05008: variable 'ansible_search_path' from source: unknown 25052 1726882494.05039: calling self._execute() 25052 1726882494.05222: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.05225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.05228: variable 'omit' from source: magic vars 25052 1726882494.05766: variable 'ansible_distribution_major_version' from source: facts 25052 1726882494.05801: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882494.06241: variable 'type' from source: play vars 25052 1726882494.06244: variable 'state' from source: include params 25052 1726882494.06246: variable 'interface' from source: play vars 25052 1726882494.06249: variable 'current_interfaces' from source: set_fact 25052 1726882494.06252: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 25052 1726882494.06254: when evaluation is False, skipping this task 25052 1726882494.06256: _execute() done 25052 1726882494.06258: dumping result to json 25052 1726882494.06260: done dumping result, returning 25052 1726882494.06262: done running TaskExecutor() for managed_node2/TASK: Create tap interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d5] 25052 1726882494.06265: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d5 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882494.06443: no more pending results, returning what we have 25052 1726882494.06447: results queue empty 25052 1726882494.06447: checking for any_errors_fatal 25052 1726882494.06456: done checking for any_errors_fatal 25052 1726882494.06457: checking for max_fail_percentage 25052 1726882494.06459: done checking for max_fail_percentage 25052 1726882494.06460: checking to see if all hosts have failed and the running result is not ok 25052 1726882494.06461: done checking to see if all hosts have failed 25052 1726882494.06461: getting the remaining hosts for this loop 25052 1726882494.06462: done getting the remaining hosts for this loop 25052 1726882494.06465: getting the next task for host managed_node2 25052 1726882494.06472: done getting next task for host managed_node2 25052 1726882494.06475: ^ task is: TASK: Delete tap interface {{ interface }} 25052 1726882494.06478: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=5, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882494.06482: getting variables 25052 1726882494.06484: in VariableManager get_vars() 25052 1726882494.06678: Calling all_inventory to load vars for managed_node2 25052 1726882494.06681: Calling groups_inventory to load vars for managed_node2 25052 1726882494.06684: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.06695: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d5 25052 1726882494.06698: WORKER PROCESS EXITING 25052 1726882494.06707: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.06709: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.06712: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.07803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.09220: done with get_vars() 25052 1726882494.09242: done getting variables 25052 1726882494.09315: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 25052 1726882494.09432: variable 'interface' from source: play vars TASK [Delete tap interface veth0] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:34:54 -0400 (0:00:00.053) 0:00:31.049 ****** 25052 1726882494.09464: entering _queue_task() for managed_node2/command 25052 1726882494.09841: worker is 1 (out of 1 available) 25052 1726882494.09856: exiting _queue_task() for managed_node2/command 25052 1726882494.09868: done queuing things up, now waiting for results queue to drain 25052 1726882494.09869: waiting for pending results... 
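Note: even when a task is skipped, the worker still returns a result dictionary (changed=false, false_condition, skip_reason), and that payload is what the callback prints as the "skipping: [managed_node2] => {...}" lines above. A small, purely hypothetical helper (not Ansible's callback plugin) that renders the same payload in that form:

    import json

    # Hypothetical renderer for illustration; values are copied from the
    # "Create tap interface veth0" skip recorded above.
    def render_skip(host, result):
        return "skipping: [%s] => %s" % (host, json.dumps(result, indent=4))

    result = {
        "changed": False,
        "false_condition": "type == 'tap' and state == 'present' "
                           "and interface not in current_interfaces",
        "skip_reason": "Conditional result was False",
    }
    print(render_skip("managed_node2", result))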
25052 1726882494.10158: running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 25052 1726882494.10178: in run() - task 12673a56-9f93-f7f6-4a6d-0000000005d6 25052 1726882494.10190: variable 'ansible_search_path' from source: unknown 25052 1726882494.10198: variable 'ansible_search_path' from source: unknown 25052 1726882494.10232: calling self._execute() 25052 1726882494.10335: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.10363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.10367: variable 'omit' from source: magic vars 25052 1726882494.10723: variable 'ansible_distribution_major_version' from source: facts 25052 1726882494.10802: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882494.10974: variable 'type' from source: play vars 25052 1726882494.10978: variable 'state' from source: include params 25052 1726882494.10984: variable 'interface' from source: play vars 25052 1726882494.10987: variable 'current_interfaces' from source: set_fact 25052 1726882494.11002: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 25052 1726882494.11005: when evaluation is False, skipping this task 25052 1726882494.11008: _execute() done 25052 1726882494.11010: dumping result to json 25052 1726882494.11014: done dumping result, returning 25052 1726882494.11021: done running TaskExecutor() for managed_node2/TASK: Delete tap interface veth0 [12673a56-9f93-f7f6-4a6d-0000000005d6] 25052 1726882494.11024: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d6 25052 1726882494.11183: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000005d6 25052 1726882494.11186: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 25052 1726882494.11238: no more pending results, returning what we have 25052 1726882494.11241: results queue empty 25052 1726882494.11242: checking for any_errors_fatal 25052 1726882494.11245: done checking for any_errors_fatal 25052 1726882494.11246: checking for max_fail_percentage 25052 1726882494.11247: done checking for max_fail_percentage 25052 1726882494.11248: checking to see if all hosts have failed and the running result is not ok 25052 1726882494.11249: done checking to see if all hosts have failed 25052 1726882494.11249: getting the remaining hosts for this loop 25052 1726882494.11251: done getting the remaining hosts for this loop 25052 1726882494.11253: getting the next task for host managed_node2 25052 1726882494.11260: done getting next task for host managed_node2 25052 1726882494.11261: ^ task is: TASK: Clean up namespace 25052 1726882494.11264: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=6, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882494.11266: getting variables 25052 1726882494.11268: in VariableManager get_vars() 25052 1726882494.11368: Calling all_inventory to load vars for managed_node2 25052 1726882494.11372: Calling groups_inventory to load vars for managed_node2 25052 1726882494.11374: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.11384: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.11386: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.11389: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.12690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.15687: done with get_vars() 25052 1726882494.15716: done getting variables 25052 1726882494.15783: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Clean up namespace] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:108 Friday 20 September 2024 21:34:54 -0400 (0:00:00.063) 0:00:31.113 ****** 25052 1726882494.15871: entering _queue_task() for managed_node2/command 25052 1726882494.16641: worker is 1 (out of 1 available) 25052 1726882494.16657: exiting _queue_task() for managed_node2/command 25052 1726882494.16671: done queuing things up, now waiting for results queue to drain 25052 1726882494.16673: waiting for pending results... 25052 1726882494.17197: running TaskExecutor() for managed_node2/TASK: Clean up namespace 25052 1726882494.17499: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000b4 25052 1726882494.17504: variable 'ansible_search_path' from source: unknown 25052 1726882494.17508: calling self._execute() 25052 1726882494.17511: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.17514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.17517: variable 'omit' from source: magic vars 25052 1726882494.17961: variable 'ansible_distribution_major_version' from source: facts 25052 1726882494.17965: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882494.17971: variable 'omit' from source: magic vars 25052 1726882494.17974: variable 'omit' from source: magic vars 25052 1726882494.17976: variable 'omit' from source: magic vars 25052 1726882494.18019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882494.18070: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882494.18094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882494.18110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882494.18122: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882494.18178: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882494.18181: variable 'ansible_host' from source: host vars for 
'managed_node2' 25052 1726882494.18184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.18508: Set connection var ansible_pipelining to False 25052 1726882494.18515: Set connection var ansible_connection to ssh 25052 1726882494.18520: Set connection var ansible_shell_type to sh 25052 1726882494.18523: Set connection var ansible_timeout to 10 25052 1726882494.18525: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882494.18527: Set connection var ansible_shell_executable to /bin/sh 25052 1726882494.18530: variable 'ansible_shell_executable' from source: unknown 25052 1726882494.18532: variable 'ansible_connection' from source: unknown 25052 1726882494.18538: variable 'ansible_module_compression' from source: unknown 25052 1726882494.18542: variable 'ansible_shell_type' from source: unknown 25052 1726882494.18544: variable 'ansible_shell_executable' from source: unknown 25052 1726882494.18546: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.18548: variable 'ansible_pipelining' from source: unknown 25052 1726882494.18550: variable 'ansible_timeout' from source: unknown 25052 1726882494.18552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.18559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882494.18724: variable 'omit' from source: magic vars 25052 1726882494.18727: starting attempt loop 25052 1726882494.18730: running the handler 25052 1726882494.18732: _low_level_execute_command(): starting 25052 1726882494.18734: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882494.19721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.19730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.19751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.19764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.19869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.21498: stdout chunk (state=3): >>>/root <<< 25052 1726882494.21653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.21663: stdout chunk (state=3): >>><<< 25052 
1726882494.21669: stderr chunk (state=3): >>><<< 25052 1726882494.21719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.21736: _low_level_execute_command(): starting 25052 1726882494.21745: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696 `" && echo ansible-tmp-1726882494.2171807-26531-57797258681696="` echo /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696 `" ) && sleep 0' 25052 1726882494.23152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.23272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.23304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.23367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.25260: stdout chunk (state=3): >>>ansible-tmp-1726882494.2171807-26531-57797258681696=/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696 <<< 25052 1726882494.25427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.25430: stdout chunk (state=3): >>><<< 25052 1726882494.25432: stderr chunk (state=3): >>><<< 25052 1726882494.25600: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882494.2171807-26531-57797258681696=/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.25603: variable 'ansible_module_compression' from source: unknown 25052 1726882494.25606: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882494.25608: variable 'ansible_facts' from source: unknown 25052 1726882494.25808: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py 25052 1726882494.26223: Sending initial data 25052 1726882494.26287: Sent initial data (155 bytes) 25052 1726882494.27926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882494.28034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.28038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.28068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.28197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.29719: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 25052 1726882494.29758: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 <<< 25052 1726882494.29766: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882494.29825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882494.29886: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpxy7jk__3 /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py <<< 25052 1726882494.29899: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py" <<< 25052 1726882494.29952: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpxy7jk__3" to remote "/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py" <<< 25052 1726882494.31237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.31241: stderr chunk (state=3): >>><<< 25052 1726882494.31243: stdout chunk (state=3): >>><<< 25052 1726882494.31245: done transferring module to remote 25052 1726882494.31247: _low_level_execute_command(): starting 25052 1726882494.31249: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/ /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py && sleep 0' 25052 1726882494.32109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.32122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882494.32134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882494.32163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882494.32271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.32378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 
1726882494.32438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.34212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.34263: stdout chunk (state=3): >>><<< 25052 1726882494.34267: stderr chunk (state=3): >>><<< 25052 1726882494.34376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.34380: _low_level_execute_command(): starting 25052 1726882494.34382: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/AnsiballZ_command.py && sleep 0' 25052 1726882494.35596: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.35681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.35753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.35817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.35871: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.36024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.51548: stdout chunk (state=3): >>> <<< 25052 1726882494.51553: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:34:54.509375", "end": "2024-09-20 
21:34:54.513802", "delta": "0:00:00.004427", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882494.52874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 25052 1726882494.52906: stderr chunk (state=3): >>><<< 25052 1726882494.52909: stdout chunk (state=3): >>><<< 25052 1726882494.52930: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "netns", "delete", "ns1"], "start": "2024-09-20 21:34:54.509375", "end": "2024-09-20 21:34:54.513802", "delta": "0:00:00.004427", "msg": "", "invocation": {"module_args": {"_raw_params": "ip netns delete ns1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
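Note: the "Clean up namespace" task above shows the full command-module round trip: probe the remote home directory (echo ~), create a per-task temp directory under ~/.ansible/tmp, upload the cached AnsiballZ_command.py payload over SFTP, chmod it, run it with the remote python, and read a single JSON document back on stdout (cmd, rc, stdout, stderr, start/end/delta). A minimal sketch, running locally instead of over SSH, of how a command can be executed and packaged into that result shape (a simplified stand-in for the real module; the playbook's actual command, ip netns delete ns1, needs root and an existing namespace, so the sketch uses ip netns list instead):

    import json
    import shlex
    import subprocess
    from datetime import datetime

    def run_command(raw_params):
        """Simplified stand-in for the command module's result shape."""
        argv = shlex.split(raw_params)
        start = datetime.now()
        proc = subprocess.run(argv, capture_output=True, text=True)
        end = datetime.now()
        return {
            "changed": True,
            "cmd": argv,
            "rc": proc.returncode,
            "stdout": proc.stdout.rstrip("\n"),
            "stderr": proc.stderr.rstrip("\n"),
            "start": str(start),
            "end": str(end),
            "delta": str(end - start),
        }

    # Harmless substitute for "ip netns delete ns1" from the log above.
    print(json.dumps(run_command("ip netns list"), indent=2))

In the log, the equivalent JSON arrives in the stdout chunk of the remote AnsiballZ run and is then echoed by the callback as the ok: [managed_node2] result further down.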
25052 1726882494.52958: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip netns delete ns1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882494.52966: _low_level_execute_command(): starting 25052 1726882494.52971: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882494.2171807-26531-57797258681696/ > /dev/null 2>&1 && sleep 0' 25052 1726882494.53545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.53622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.53636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.53686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.55489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.55519: stderr chunk (state=3): >>><<< 25052 1726882494.55528: stdout chunk (state=3): >>><<< 25052 1726882494.55539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.55544: handler run complete 25052 1726882494.55562: Evaluated conditional (False): False 25052 1726882494.55571: attempt loop complete, returning result 25052 1726882494.55574: _execute() done 25052 1726882494.55577: dumping result to json 25052 1726882494.55582: done dumping result, returning 25052 1726882494.55595: done running TaskExecutor() for managed_node2/TASK: Clean up namespace [12673a56-9f93-f7f6-4a6d-0000000000b4] 25052 1726882494.55598: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b4 25052 1726882494.55700: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b4 25052 1726882494.55703: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "netns", "delete", "ns1" ], "delta": "0:00:00.004427", "end": "2024-09-20 21:34:54.513802", "rc": 0, "start": "2024-09-20 21:34:54.509375" } 25052 1726882494.55760: no more pending results, returning what we have 25052 1726882494.55764: results queue empty 25052 1726882494.55764: checking for any_errors_fatal 25052 1726882494.55769: done checking for any_errors_fatal 25052 1726882494.55770: checking for max_fail_percentage 25052 1726882494.55772: done checking for max_fail_percentage 25052 1726882494.55773: checking to see if all hosts have failed and the running result is not ok 25052 1726882494.55773: done checking to see if all hosts have failed 25052 1726882494.55774: getting the remaining hosts for this loop 25052 1726882494.55775: done getting the remaining hosts for this loop 25052 1726882494.55778: getting the next task for host managed_node2 25052 1726882494.55783: done getting next task for host managed_node2 25052 1726882494.55786: ^ task is: TASK: Verify network state restored to default 25052 1726882494.55788: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=7, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882494.55796: getting variables 25052 1726882494.55797: in VariableManager get_vars() 25052 1726882494.55837: Calling all_inventory to load vars for managed_node2 25052 1726882494.55840: Calling groups_inventory to load vars for managed_node2 25052 1726882494.55842: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.55853: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.55855: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.55858: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.57056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.58585: done with get_vars() 25052 1726882494.58608: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:113 Friday 20 September 2024 21:34:54 -0400 (0:00:00.428) 0:00:31.541 ****** 25052 1726882494.58675: entering _queue_task() for managed_node2/include_tasks 25052 1726882494.58942: worker is 1 (out of 1 available) 25052 1726882494.58956: exiting _queue_task() for managed_node2/include_tasks 25052 1726882494.58967: done queuing things up, now waiting for results queue to drain 25052 1726882494.58968: waiting for pending results... 25052 1726882494.59334: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 25052 1726882494.59442: in run() - task 12673a56-9f93-f7f6-4a6d-0000000000b5 25052 1726882494.59453: variable 'ansible_search_path' from source: unknown 25052 1726882494.59489: calling self._execute() 25052 1726882494.59601: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.59616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.59627: variable 'omit' from source: magic vars 25052 1726882494.59974: variable 'ansible_distribution_major_version' from source: facts 25052 1726882494.59986: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882494.59990: _execute() done 25052 1726882494.60001: dumping result to json 25052 1726882494.60004: done dumping result, returning 25052 1726882494.60007: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [12673a56-9f93-f7f6-4a6d-0000000000b5] 25052 1726882494.60012: sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b5 25052 1726882494.60095: done sending task result for task 12673a56-9f93-f7f6-4a6d-0000000000b5 25052 1726882494.60098: WORKER PROCESS EXITING 25052 1726882494.60124: no more pending results, returning what we have 25052 1726882494.60129: in VariableManager get_vars() 25052 1726882494.60175: Calling all_inventory to load vars for managed_node2 25052 1726882494.60177: Calling groups_inventory to load vars for managed_node2 25052 1726882494.60179: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.60192: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.60197: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.60200: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.61452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.63064: done with get_vars() 25052 1726882494.63078: 
variable 'ansible_search_path' from source: unknown 25052 1726882494.63089: we have included files to process 25052 1726882494.63089: generating all_blocks data 25052 1726882494.63094: done generating all_blocks data 25052 1726882494.63098: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25052 1726882494.63099: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25052 1726882494.63101: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 25052 1726882494.63368: done processing included file 25052 1726882494.63369: iterating over new_blocks loaded from include file 25052 1726882494.63370: in VariableManager get_vars() 25052 1726882494.63385: done with get_vars() 25052 1726882494.63386: filtering new block on tags 25052 1726882494.63403: done filtering new block on tags 25052 1726882494.63404: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 25052 1726882494.63408: extending task lists for all hosts with included blocks 25052 1726882494.65789: done extending task lists 25052 1726882494.65795: done processing included files 25052 1726882494.65796: results queue empty 25052 1726882494.65796: checking for any_errors_fatal 25052 1726882494.65801: done checking for any_errors_fatal 25052 1726882494.65801: checking for max_fail_percentage 25052 1726882494.65803: done checking for max_fail_percentage 25052 1726882494.65803: checking to see if all hosts have failed and the running result is not ok 25052 1726882494.65804: done checking to see if all hosts have failed 25052 1726882494.65805: getting the remaining hosts for this loop 25052 1726882494.65806: done getting the remaining hosts for this loop 25052 1726882494.65808: getting the next task for host managed_node2 25052 1726882494.65812: done getting next task for host managed_node2 25052 1726882494.65815: ^ task is: TASK: Check routes and DNS 25052 1726882494.65817: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 25052 1726882494.65820: getting variables 25052 1726882494.65821: in VariableManager get_vars() 25052 1726882494.65841: Calling all_inventory to load vars for managed_node2 25052 1726882494.65843: Calling groups_inventory to load vars for managed_node2 25052 1726882494.65845: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882494.65852: Calling all_plugins_play to load vars for managed_node2 25052 1726882494.65861: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882494.65864: Calling groups_plugins_play to load vars for managed_node2 25052 1726882494.67129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882494.68649: done with get_vars() 25052 1726882494.68666: done getting variables 25052 1726882494.68704: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:34:54 -0400 (0:00:00.100) 0:00:31.642 ****** 25052 1726882494.68726: entering _queue_task() for managed_node2/shell 25052 1726882494.68987: worker is 1 (out of 1 available) 25052 1726882494.69002: exiting _queue_task() for managed_node2/shell 25052 1726882494.69014: done queuing things up, now waiting for results queue to drain 25052 1726882494.69016: waiting for pending results... 25052 1726882494.69189: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 25052 1726882494.69260: in run() - task 12673a56-9f93-f7f6-4a6d-00000000075e 25052 1726882494.69272: variable 'ansible_search_path' from source: unknown 25052 1726882494.69276: variable 'ansible_search_path' from source: unknown 25052 1726882494.69307: calling self._execute() 25052 1726882494.69381: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.69385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.69397: variable 'omit' from source: magic vars 25052 1726882494.69668: variable 'ansible_distribution_major_version' from source: facts 25052 1726882494.69680: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882494.69683: variable 'omit' from source: magic vars 25052 1726882494.69717: variable 'omit' from source: magic vars 25052 1726882494.69741: variable 'omit' from source: magic vars 25052 1726882494.69771: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882494.69802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882494.69820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882494.69832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882494.69842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882494.69866: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
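Note: the "Check routes and DNS" task goes through the shell action, which layers on top of the command action, and the connection setup that follows repeats the pattern seen earlier: connection vars are pinned (ansible_connection=ssh, ansible_shell_type=sh, ansible_shell_executable=/bin/sh, pipelining off, timeout 10) and every low-level step is sent as /bin/sh -c '<cmd> && sleep 0' over a multiplexed SSH session (the "auto-mux" master under /root/.ansible/cp/). A minimal sketch of that wrapping and ControlMaster reuse, assuming a reachable host; the host name and control-socket path are placeholders, not values from this run:

    import shlex
    import subprocess

    HOST = "managed_node2"              # placeholder inventory host
    CONTROL_PATH = "~/.ansible/cp/%C"   # placeholder control socket path

    def low_level_execute(cmd, timeout=10):
        """Sketch: wrap a command as /bin/sh -c '<cmd> && sleep 0' and send it
        over an ssh connection that reuses an existing master socket."""
        wrapped = "/bin/sh -c %s" % shlex.quote("%s && sleep 0" % cmd)
        ssh_cmd = [
            "ssh",
            "-o", "ControlMaster=auto",      # reuse a master if one exists
            "-o", "ControlPersist=60s",
            "-o", "ControlPath=" + CONTROL_PATH,
            "-o", "ConnectTimeout=%d" % timeout,
            HOST,
            wrapped,
        ]
        return subprocess.run(ssh_cmd, capture_output=True, text=True)

    result = low_level_execute("echo ~")   # same home-directory probe as in the log
    print(result.returncode, result.stdout.strip())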
25052 1726882494.69869: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.69872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.69945: Set connection var ansible_pipelining to False 25052 1726882494.69948: Set connection var ansible_connection to ssh 25052 1726882494.69950: Set connection var ansible_shell_type to sh 25052 1726882494.69956: Set connection var ansible_timeout to 10 25052 1726882494.69963: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882494.69967: Set connection var ansible_shell_executable to /bin/sh 25052 1726882494.69983: variable 'ansible_shell_executable' from source: unknown 25052 1726882494.69986: variable 'ansible_connection' from source: unknown 25052 1726882494.69989: variable 'ansible_module_compression' from source: unknown 25052 1726882494.69996: variable 'ansible_shell_type' from source: unknown 25052 1726882494.69999: variable 'ansible_shell_executable' from source: unknown 25052 1726882494.70003: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882494.70006: variable 'ansible_pipelining' from source: unknown 25052 1726882494.70008: variable 'ansible_timeout' from source: unknown 25052 1726882494.70011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882494.70133: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882494.70153: variable 'omit' from source: magic vars 25052 1726882494.70157: starting attempt loop 25052 1726882494.70159: running the handler 25052 1726882494.70162: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882494.70277: _low_level_execute_command(): starting 25052 1726882494.70280: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882494.70924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.70946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.70969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.70976: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.71090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.72691: stdout chunk (state=3): >>>/root <<< 25052 1726882494.72789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.72832: stderr chunk (state=3): >>><<< 25052 1726882494.72834: stdout chunk (state=3): >>><<< 25052 1726882494.72854: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.72870: _low_level_execute_command(): starting 25052 1726882494.72876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549 `" && echo ansible-tmp-1726882494.7285843-26554-159646780064549="` echo /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549 `" ) && sleep 0' 25052 1726882494.73495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 25052 1726882494.73507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.73510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.73546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.73620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 
1726882494.75529: stdout chunk (state=3): >>>ansible-tmp-1726882494.7285843-26554-159646780064549=/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549 <<< 25052 1726882494.75684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.75687: stdout chunk (state=3): >>><<< 25052 1726882494.75690: stderr chunk (state=3): >>><<< 25052 1726882494.75901: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882494.7285843-26554-159646780064549=/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.75905: variable 'ansible_module_compression' from source: unknown 25052 1726882494.75907: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882494.75909: variable 'ansible_facts' from source: unknown 25052 1726882494.75956: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py 25052 1726882494.76141: Sending initial data 25052 1726882494.76151: Sent initial data (156 bytes) 25052 1726882494.76834: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.76901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.76964: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.76980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 
1726882494.77021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.77124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.78672: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882494.78747: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882494.78810: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp981c9t0t /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py <<< 25052 1726882494.78813: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py" <<< 25052 1726882494.78871: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmp981c9t0t" to remote "/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py" <<< 25052 1726882494.79713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.79778: stderr chunk (state=3): >>><<< 25052 1726882494.79855: stdout chunk (state=3): >>><<< 25052 1726882494.79865: done transferring module to remote 25052 1726882494.79882: _low_level_execute_command(): starting 25052 1726882494.79897: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/ /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py && sleep 0' 25052 1726882494.80572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.80585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882494.80606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882494.80730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882494.80758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.80853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.82624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882494.82653: stdout chunk (state=3): >>><<< 25052 1726882494.82656: stderr chunk (state=3): >>><<< 25052 1726882494.82673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882494.82699: _low_level_execute_command(): starting 25052 1726882494.82703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py && sleep 0' 25052 1726882494.83369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882494.83407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.83418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 25052 1726882494.83464: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882494.83477: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882494.83532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882494.83551: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 25052 1726882494.83590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882494.83695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882494.99370: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3176sec preferred_lft 3176sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:54.984533", "end": "2024-09-20 21:34:54.992692", "delta": "0:00:00.008159", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882495.00805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882495.00810: stdout chunk (state=3): >>><<< 25052 1726882495.00823: stderr chunk (state=3): >>><<< 25052 1726882495.00899: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3176sec preferred_lft 3176sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:34:54.984533", "end": "2024-09-20 21:34:54.992692", "delta": "0:00:00.008159", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
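For readability, the shell payload that the ansible.legacy.command module executed for the "Check routes and DNS" task is reproduced below, unescaped from the _raw_params value in the module_args recorded above. Every command is taken verbatim from the log; nothing new is introduced:

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi

The IP, IP ROUTE, IP -6 ROUTE, and RESOLV sections in the captured stdout correspond one-to-one to the echo markers in this script.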
25052 1726882495.01000: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882495.01004: _low_level_execute_command(): starting 25052 1726882495.01006: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/ > /dev/null 2>&1 && sleep 0' 25052 1726882495.01578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 25052 1726882495.01599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882495.01619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882495.01641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882495.01762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882495.01801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.01898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.03741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.03767: stdout chunk (state=3): >>><<< 25052 1726882495.03783: stderr chunk (state=3): >>><<< 25052 1726882495.03805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882495.03812: handler run complete 25052 1726882495.03858: Evaluated conditional (False): False 25052 1726882495.03884: attempt loop complete, returning result 25052 1726882495.03888: _execute() done 25052 1726882495.03890: dumping result to json 25052 1726882495.03909: done dumping result, returning 25052 1726882495.03912: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [12673a56-9f93-f7f6-4a6d-00000000075e] 25052 1726882495.03915: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000075e 25052 1726882495.04024: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000075e 25052 1726882495.04027: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008159", "end": "2024-09-20 21:34:54.992692", "rc": 0, "start": "2024-09-20 21:34:54.984533" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3176sec preferred_lft 3176sec inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 25052 1726882495.04095: no more pending results, returning what we have 25052 1726882495.04099: results queue empty 25052 1726882495.04099: checking for any_errors_fatal 25052 1726882495.04101: done checking for any_errors_fatal 25052 1726882495.04101: checking for max_fail_percentage 25052 1726882495.04103: done checking for max_fail_percentage 25052 1726882495.04104: checking to see if all hosts have failed and the running result is not ok 25052 1726882495.04105: done checking to see if all hosts have failed 25052 1726882495.04105: getting the remaining hosts for this loop 25052 1726882495.04107: done getting the remaining hosts for this loop 25052 1726882495.04109: getting the next task for host managed_node2 25052 1726882495.04116: done getting next task for host 
managed_node2 25052 1726882495.04119: ^ task is: TASK: Verify DNS and network connectivity 25052 1726882495.04121: ^ state is: HOST STATE: block=3, task=15, rescue=0, always=8, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 25052 1726882495.04125: getting variables 25052 1726882495.04127: in VariableManager get_vars() 25052 1726882495.04166: Calling all_inventory to load vars for managed_node2 25052 1726882495.04168: Calling groups_inventory to load vars for managed_node2 25052 1726882495.04175: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882495.04184: Calling all_plugins_play to load vars for managed_node2 25052 1726882495.04187: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882495.04189: Calling groups_plugins_play to load vars for managed_node2 25052 1726882495.05826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882495.07164: done with get_vars() 25052 1726882495.07195: done getting variables 25052 1726882495.07281: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:34:55 -0400 (0:00:00.385) 0:00:32.027 ****** 25052 1726882495.07308: entering _queue_task() for managed_node2/shell 25052 1726882495.07589: worker is 1 (out of 1 available) 25052 1726882495.07605: exiting _queue_task() for managed_node2/shell 25052 1726882495.07616: done queuing things up, now waiting for results queue to drain 25052 1726882495.07618: waiting for pending results... 
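Stripped of the repeated SSH debug output, the remote command sequence that the connection plugin ran for the task above reduces to the five steps below; the same pattern (create a per-task temp directory, transfer the AnsiballZ payload over sftp, chmod it, run it with the remote Python, then remove the temp directory) repeats for the "Verify DNS and network connectivity" task that follows. All paths and commands are copied from the log above; the summary is slightly abridged, as noted in the comments:

    # 1. create a private per-task temp dir on the managed host
    #    (the recorded command also echoes the created path back to the controller)
    /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549 `" ) && sleep 0'
    # 2. transfer the built module over the existing SSH connection with sftp
    #    (shown here as the sftp client command echoed in the log)
    # sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmp981c9t0t /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py
    # 3. make the temp dir and module executable for the owner
    /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/ /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py && sleep 0'
    # 4. execute the module with the remote interpreter
    /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/AnsiballZ_command.py && sleep 0'
    # 5. clean up the per-task temp dir
    /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882494.7285843-26554-159646780064549/ > /dev/null 2>&1 && sleep 0'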
25052 1726882495.07817: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 25052 1726882495.07929: in run() - task 12673a56-9f93-f7f6-4a6d-00000000075f 25052 1726882495.07939: variable 'ansible_search_path' from source: unknown 25052 1726882495.07942: variable 'ansible_search_path' from source: unknown 25052 1726882495.07971: calling self._execute() 25052 1726882495.08080: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882495.08084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882495.08098: variable 'omit' from source: magic vars 25052 1726882495.08448: variable 'ansible_distribution_major_version' from source: facts 25052 1726882495.08451: Evaluated conditional (ansible_distribution_major_version != '6'): True 25052 1726882495.08540: variable 'ansible_facts' from source: unknown 25052 1726882495.08992: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 25052 1726882495.09001: variable 'omit' from source: magic vars 25052 1726882495.09032: variable 'omit' from source: magic vars 25052 1726882495.09053: variable 'omit' from source: magic vars 25052 1726882495.09085: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 25052 1726882495.09120: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 25052 1726882495.09135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 25052 1726882495.09147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882495.09158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 25052 1726882495.09181: variable 'inventory_hostname' from source: host vars for 'managed_node2' 25052 1726882495.09184: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882495.09187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882495.09262: Set connection var ansible_pipelining to False 25052 1726882495.09265: Set connection var ansible_connection to ssh 25052 1726882495.09267: Set connection var ansible_shell_type to sh 25052 1726882495.09273: Set connection var ansible_timeout to 10 25052 1726882495.09279: Set connection var ansible_module_compression to ZIP_DEFLATED 25052 1726882495.09284: Set connection var ansible_shell_executable to /bin/sh 25052 1726882495.09304: variable 'ansible_shell_executable' from source: unknown 25052 1726882495.09309: variable 'ansible_connection' from source: unknown 25052 1726882495.09312: variable 'ansible_module_compression' from source: unknown 25052 1726882495.09314: variable 'ansible_shell_type' from source: unknown 25052 1726882495.09317: variable 'ansible_shell_executable' from source: unknown 25052 1726882495.09319: variable 'ansible_host' from source: host vars for 'managed_node2' 25052 1726882495.09321: variable 'ansible_pipelining' from source: unknown 25052 1726882495.09323: variable 'ansible_timeout' from source: unknown 25052 1726882495.09326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 25052 1726882495.09426: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882495.09436: variable 'omit' from source: magic vars 25052 1726882495.09440: starting attempt loop 25052 1726882495.09443: running the handler 25052 1726882495.09453: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 25052 1726882495.09467: _low_level_execute_command(): starting 25052 1726882495.09475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 25052 1726882495.09987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882495.10027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882495.10031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 25052 1726882495.10036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.10082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882495.10085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882495.10087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.10155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.11736: stdout chunk (state=3): >>>/root <<< 25052 1726882495.11843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.11875: stderr chunk (state=3): >>><<< 25052 1726882495.11878: stdout chunk (state=3): >>><<< 25052 1726882495.11903: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882495.11918: _low_level_execute_command(): starting 25052 1726882495.11923: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566 `" && echo ansible-tmp-1726882495.1190288-26574-150440334265566="` echo /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566 `" ) && sleep 0' 25052 1726882495.12633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882495.12636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.12638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882495.12640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.12687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882495.12691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882495.12697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.12757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.14611: stdout chunk (state=3): >>>ansible-tmp-1726882495.1190288-26574-150440334265566=/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566 <<< 25052 1726882495.14721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.14748: stderr chunk (state=3): >>><<< 25052 1726882495.14751: stdout chunk (state=3): >>><<< 25052 1726882495.14766: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882495.1190288-26574-150440334265566=/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882495.14814: variable 'ansible_module_compression' from source: unknown 25052 1726882495.14879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-25052f9s2671v/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 25052 1726882495.14924: variable 'ansible_facts' from source: unknown 25052 1726882495.14986: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py 25052 1726882495.15178: Sending initial data 25052 1726882495.15181: Sent initial data (156 bytes) 25052 1726882495.15642: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882495.15645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882495.15647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.15649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 25052 1726882495.15651: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882495.15653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.15706: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882495.15710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.15775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.17286: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 25052 1726882495.17292: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 25052 1726882495.17346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 25052 1726882495.17410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-25052f9s2671v/tmpq7yudaaa /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py <<< 25052 1726882495.17423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py" <<< 25052 1726882495.17470: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-25052f9s2671v/tmpq7yudaaa" to remote "/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py" <<< 25052 1726882495.17477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py" <<< 25052 1726882495.18209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.18247: stderr chunk (state=3): >>><<< 25052 1726882495.18250: stdout chunk (state=3): >>><<< 25052 1726882495.18302: done transferring module to remote 25052 1726882495.18311: _low_level_execute_command(): starting 25052 1726882495.18316: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/ /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py && sleep 0' 25052 1726882495.18958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.19014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882495.19017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882495.19023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.19082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.20814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.20884: stderr chunk (state=3): >>><<< 25052 1726882495.20887: stdout chunk (state=3): >>><<< 25052 1726882495.20916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882495.20919: _low_level_execute_command(): starting 25052 1726882495.20925: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/AnsiballZ_command.py && sleep 0' 25052 1726882495.21555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 25052 1726882495.21661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 25052 1726882495.21665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882495.21667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.21671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.21720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.21787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.50973: stdout chunk (state=3): >>> <<< 25052 1726882495.50991: stdout chunk (state=3): >>>{"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 
wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6469 0 --:--:-- --:--:-- --:--:-- 6489\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3931 0 --:--:-- --:--:-- --:--:-- 3986", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:55.365077", "end": "2024-09-20 21:34:55.508890", "delta": "0:00:00.143813", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 25052 1726882495.52518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 25052 1726882495.52539: stderr chunk (state=3): >>><<< 25052 1726882495.52543: stdout chunk (state=3): >>><<< 25052 1726882495.52606: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6469 0 --:--:-- --:--:-- --:--:-- 6489\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3931 0 --:--:-- --:--:-- --:--:-- 3986", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:34:55.365077", "end": "2024-09-20 21:34:55.508890", "delta": "0:00:00.143813", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 25052 1726882495.52667: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 25052 1726882495.52688: _low_level_execute_command(): starting 25052 1726882495.52696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882495.1190288-26574-150440334265566/ > /dev/null 2>&1 && sleep 0' 25052 1726882495.53376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 25052 1726882495.53379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 25052 1726882495.53391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.53396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 25052 1726882495.53449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 25052 1726882495.53457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 25052 1726882495.53460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 25052 1726882495.53517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 25052 1726882495.55405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 25052 1726882495.55409: stdout chunk (state=3): >>><<< 25052 1726882495.55412: stderr chunk (state=3): >>><<< 25052 1726882495.55414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 25052 1726882495.55416: handler run complete 25052 1726882495.55443: Evaluated conditional (False): False 25052 1726882495.55452: attempt loop complete, returning result 25052 1726882495.55455: _execute() done 25052 1726882495.55457: dumping result to json 25052 1726882495.55510: done dumping result, returning 25052 1726882495.55514: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [12673a56-9f93-f7f6-4a6d-00000000075f] 25052 1726882495.55516: sending task result for task 12673a56-9f93-f7f6-4a6d-00000000075f 25052 1726882495.55586: done sending task result for task 12673a56-9f93-f7f6-4a6d-00000000075f 25052 1726882495.55589: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.143813", "end": "2024-09-20 21:34:55.508890", "rc": 0, "start": "2024-09-20 21:34:55.365077" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6469 0 --:--:-- --:--:-- --:--:-- 6489 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 3931 0 --:--:-- --:--:-- --:--:-- 3986 25052 1726882495.55658: no more pending results, returning what we have 25052 1726882495.55661: results queue empty 25052 1726882495.55662: checking for any_errors_fatal 25052 1726882495.55671: done checking for 
any_errors_fatal 25052 1726882495.55672: checking for max_fail_percentage 25052 1726882495.55673: done checking for max_fail_percentage 25052 1726882495.55674: checking to see if all hosts have failed and the running result is not ok 25052 1726882495.55675: done checking to see if all hosts have failed 25052 1726882495.55676: getting the remaining hosts for this loop 25052 1726882495.55677: done getting the remaining hosts for this loop 25052 1726882495.55684: getting the next task for host managed_node2 25052 1726882495.55698: done getting next task for host managed_node2 25052 1726882495.55700: ^ task is: TASK: meta (flush_handlers) 25052 1726882495.55702: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882495.55706: getting variables 25052 1726882495.55708: in VariableManager get_vars() 25052 1726882495.55748: Calling all_inventory to load vars for managed_node2 25052 1726882495.55750: Calling groups_inventory to load vars for managed_node2 25052 1726882495.55753: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882495.55763: Calling all_plugins_play to load vars for managed_node2 25052 1726882495.55765: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882495.55768: Calling groups_plugins_play to load vars for managed_node2 25052 1726882495.57183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882495.58246: done with get_vars() 25052 1726882495.58264: done getting variables 25052 1726882495.58317: in VariableManager get_vars() 25052 1726882495.58327: Calling all_inventory to load vars for managed_node2 25052 1726882495.58329: Calling groups_inventory to load vars for managed_node2 25052 1726882495.58330: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882495.58334: Calling all_plugins_play to load vars for managed_node2 25052 1726882495.58335: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882495.58337: Calling groups_plugins_play to load vars for managed_node2 25052 1726882495.59061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882495.60323: done with get_vars() 25052 1726882495.60348: done queuing things up, now waiting for results queue to drain 25052 1726882495.60350: results queue empty 25052 1726882495.60351: checking for any_errors_fatal 25052 1726882495.60354: done checking for any_errors_fatal 25052 1726882495.60355: checking for max_fail_percentage 25052 1726882495.60356: done checking for max_fail_percentage 25052 1726882495.60357: checking to see if all hosts have failed and the running result is not ok 25052 1726882495.60357: done checking to see if all hosts have failed 25052 1726882495.60358: getting the remaining hosts for this loop 25052 1726882495.60359: done getting the remaining hosts for this loop 25052 1726882495.60362: getting the next task for host managed_node2 25052 1726882495.60365: done getting next task for host managed_node2 25052 1726882495.60367: ^ task is: TASK: meta (flush_handlers) 25052 1726882495.60368: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 25052 1726882495.60371: getting variables 25052 1726882495.60372: in VariableManager get_vars() 25052 1726882495.60384: Calling all_inventory to load vars for managed_node2 25052 1726882495.60387: Calling groups_inventory to load vars for managed_node2 25052 1726882495.60388: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882495.60395: Calling all_plugins_play to load vars for managed_node2 25052 1726882495.60398: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882495.60401: Calling groups_plugins_play to load vars for managed_node2 25052 1726882495.61502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882495.63085: done with get_vars() 25052 1726882495.63106: done getting variables 25052 1726882495.63159: in VariableManager get_vars() 25052 1726882495.63174: Calling all_inventory to load vars for managed_node2 25052 1726882495.63176: Calling groups_inventory to load vars for managed_node2 25052 1726882495.63179: Calling all_plugins_inventory to load vars for managed_node2 25052 1726882495.63184: Calling all_plugins_play to load vars for managed_node2 25052 1726882495.63186: Calling groups_plugins_inventory to load vars for managed_node2 25052 1726882495.63189: Calling groups_plugins_play to load vars for managed_node2 25052 1726882495.64292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 25052 1726882495.65833: done with get_vars() 25052 1726882495.65859: done queuing things up, now waiting for results queue to drain 25052 1726882495.65861: results queue empty 25052 1726882495.65862: checking for any_errors_fatal 25052 1726882495.65863: done checking for any_errors_fatal 25052 1726882495.65864: checking for max_fail_percentage 25052 1726882495.65865: done checking for max_fail_percentage 25052 1726882495.65865: checking to see if all hosts have failed and the running result is not ok 25052 1726882495.65866: done checking to see if all hosts have failed 25052 1726882495.65867: getting the remaining hosts for this loop 25052 1726882495.65868: done getting the remaining hosts for this loop 25052 1726882495.65871: getting the next task for host managed_node2 25052 1726882495.65874: done getting next task for host managed_node2 25052 1726882495.65875: ^ task is: None 25052 1726882495.65877: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 25052 1726882495.65878: done queuing things up, now waiting for results queue to drain 25052 1726882495.65878: results queue empty 25052 1726882495.65879: checking for any_errors_fatal 25052 1726882495.65880: done checking for any_errors_fatal 25052 1726882495.65880: checking for max_fail_percentage 25052 1726882495.65881: done checking for max_fail_percentage 25052 1726882495.65882: checking to see if all hosts have failed and the running result is not ok 25052 1726882495.65883: done checking to see if all hosts have failed 25052 1726882495.65885: getting the next task for host managed_node2 25052 1726882495.65887: done getting next task for host managed_node2 25052 1726882495.65888: ^ task is: None 25052 1726882495.65889: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=76   changed=2    unreachable=0    failed=0    skipped=62   rescued=0    ignored=0

Friday 20 September 2024  21:34:55 -0400 (0:00:00.588)       0:00:32.616 ******
===============================================================================
fedora.linux_system_roles.network : Configure networking connection profiles --- 3.07s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which services are running ---- 1.95s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.48s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml:6
Create veth interface veth0 --------------------------------------------- 1.17s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.86s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Install iproute --------------------------------------------------------- 0.85s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.80s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.79s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Install iproute --------------------------------------------------------- 0.70s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Ensure ping6 command is present ----------------------------------------- 0.69s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ipv6.yml:81
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.63s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather current interface info ------------------------------------------- 0.61s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3
Verify DNS and network connectivity ------------------------------------- 0.59s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 0.55s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Delete veth interface veth0 --------------------------------------------- 0.49s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43
25052 1726882495.66180: RUNNING CLEANUP
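The escaped "cmd" string in the "Verify DNS and network connectivity" result above is easier to read as a standalone script. The snippet below is only a re-indented copy of what the log records the task executing: the host list, the getent lookup, and the bare curl probe are exactly as shown in the result. The bash shebang is an assumption added so the snippet can run outside Ansible (the log shows the command going through /bin/sh on the managed host).

#!/bin/bash
# Re-indented copy of the command recorded in the task result above.
# Fails on the first mirror host that cannot be resolved or reached.
set -euo pipefail
echo CHECK DNS AND CONNECTIVITY
for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # DNS check: getent consults the system resolver configuration (nsswitch).
    if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
    fi
    # Connectivity check: the response body is discarded; only reachability matters.
    if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
    fi
done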
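The "auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0'" stderr lines show the ssh connection plugin reusing an already established ControlMaster session rather than opening a fresh SSH connection for each module invocation. The following is a rough illustration of what that amounts to at the OpenSSH level, not the exact options this run used: the ControlPersist timeout, the remote user, and the probe command are assumptions; only the control socket path and the target address 10.31.14.69 come from this log.

# Illustration only: approximate OpenSSH multiplexing options behind the
# "auto-mux" / "mux_client_request_session" lines above. ControlPersist=60s,
# the remote user, and the probe command are assumptions; the socket path and
# the address 10.31.14.69 are taken from the log.
ssh -o ControlMaster=auto \
    -o ControlPersist=60s \
    -o ControlPath=/root/.ansible/cp/6f323b04b0 \
    root@10.31.14.69 'true && sleep 0'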
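The "25052 1726882495.xxxxx:" prefixes throughout this log are ansible-core's internal debug stream, and the sorted per-task duration table printed after PLAY RECAP matches the output of a task-profiling callback such as ansible.posix.profile_tasks. Below is a hedged sketch of an invocation that would produce similarly structured output: the playbook path is taken from the timing table above, while the inventory path, the specific callback, and the use of environment variables rather than ansible.cfg are assumptions.

# Sketch: a similarly verbose run with internal debug tracing and per-task timing.
# ANSIBLE_DEBUG prints the "PID timestamp: message" lines seen throughout this log;
# profile_tasks is assumed to be the source of the duration table and may differ
# from whatever callback the original run actually enabled.
ANSIBLE_DEBUG=1 \
ANSIBLE_CALLBACKS_ENABLED=ansible.posix.profile_tasks \
ansible-playbook -vv \
    -i inventory.yml \
    /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_ipv6_nm.yml
# "-vv" matches the "_ansible_verbosity: 2" recorded in the module arguments above;
# "inventory.yml" is a placeholder for the run's actual inventory file.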